From 194f90eeac1b7caa9f0deab762f3e87a3d75eaa5 Mon Sep 17 00:00:00 2001 From: Claude Date: Mon, 23 Feb 2026 08:20:33 +0000 Subject: [PATCH 1/5] docs: rewrite CLAUDE.md for clarity and conciseness Remove verbose implementation status changelog and redundant module listing. Focus on architecture patterns, startup flow, plugin system, and practical build/test commands that help a new Claude instance be productive quickly. Fix test count (154, not 79). https://claude.ai/code/session_01W29EWLxxuZuRnfP9pJjxuB --- CLAUDE.md | 338 ++++++++++++++---------------------------------------- 1 file changed, 89 insertions(+), 249 deletions(-) diff --git a/CLAUDE.md b/CLAUDE.md index 2c900b5..f60ba70 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -4,285 +4,125 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co ## Rules -- **Every time you make changes to the codebase**, update this CLAUDE.md file to reflect the new state (implementation status, module structure, known issues, etc.). +- **Every time you make changes to the codebase**, update this CLAUDE.md file to reflect the new state (architecture, known issues, etc.). - **If changes affect user-facing features, architecture, or usage**, also update README.md accordingly. ## Project Overview -**Zero Layer (ZL)** is a universal Linux package manager with native binary translation, written in Rust. It installs packages from any source (pacman, apt, rpm, AppImage, GitHub releases, pip, npm, cargo, etc.) on any Linux system by translating them natively — no containers, VMs, or isolation layers. After installation, translated packages are indistinguishable from native ones. Zero runtime overhead: all translation happens at install time. +**Zero Layer (ZL)** is a universal Linux package manager with native binary translation, written in Rust. It installs packages from any source (pacman, apt, AUR, GitHub releases) on any Linux system by translating them natively — no containers, VMs, or isolation layers. All translation happens at install time; installed packages run with zero overhead. **Binary name**: `zl` +**Rust edition**: 2024 (requires Rust 1.85+) **License**: GPL v3 ## Build Commands ```bash -cargo build # Build the project -cargo run -- # Run (e.g., cargo run -- install firefox) -cargo test # Run all tests -cargo test # Run a single test by name -cargo clippy # Lint with clippy -cargo fmt # Format code -cargo fmt -- --check # Check formatting without modifying files +cargo build # Debug build +cargo build --release # Release build +cargo run -- # Run (e.g., cargo run -- install firefox) +cargo test # Run all tests (154 tests: 74 bin + 80 lib) +cargo test # Run a single test by name +cargo test -- --nocapture # Run tests with stdout visible +cargo clippy # Lint +cargo fmt # Format +cargo fmt -- --check # Check formatting without modifying ``` +There are no integration tests — all tests are unit tests inside `#[cfg(test)]` modules within source files. There is no CI configuration; run `cargo test && cargo clippy && cargo fmt -- --check` locally before committing. + ## Architecture -### How it works (install flow) -1. **Detect system** — auto-detect arch, interpreter, libc, lib dirs, layout (SystemProfile) -2. **Resolve** — recursive dependency resolution with cycle detection -3. **Check conflicts** — file ownership, binary/library name, version constraints, declared conflicts -4. **Download** packages in parallel (up to 4 threads) with progress bars and retry -5. **Verify** — SHA256 checksum + GPG signature verification (best-effort) -6. **Extract** and analyze all ELF binaries with `goblin` -7. **Patch** binaries with `elb` (interpreter, RUNPATH) using detected page size -8. **Remap** FHS paths to target system structure using dynamic prefix map -9. **Install** with atomic transaction — rollback on any failure -10. **Track** every file in a dependency graph and persistent database -11. **Verify** everything resolves correctly post-install (using detected lib dirs) - -### Module structure -``` -src/ - main.rs # Entry point: CLI parsing + SystemProfile detection + dispatch - lib.rs # Re-exports core public API - error.rs # ZlError enum (thiserror), ZlResult type alias, retry_with_backoff() - config.rs # Config parsing (~/.config/zl/config.toml), SystemConfig for overrides - paths.rs # ZL directory layout (~/.local/share/zl/) - system/ - mod.rs # SystemProfile struct — auto-detected host system profile - arch.rs # CPU architecture detection (x86_64, aarch64, armv7, riscv64, etc.) - interpreter.rs # Dynamic linker detection (reads PT_INTERP from /bin/sh via goblin) - libc.rs # C library detection (glibc/musl/bionic from interpreter name) - paths.rs # Host lib/bin path discovery (ldconfig, ld.so.conf, multiarch, NixOS) - detect.rs # System layout detection (FHS, MergedUsr, NixOS, Guix, Termux, GoboLinux) - cli/ - mod.rs # Cli struct (clap derive), Commands enum, all arg structs - deps.rs # Dependency resolution: recursive resolve, install plan, cycle detection - install.rs # Install: conflict check, parallel download, verification, transaction-wrapped install, multi-version support - remove.rs # Remove: delete files, symlinks, deps, DB entries, cascade orphans, version-specific removal - search.rs # Search subcommand handler - update.rs # Update: check newer versions, skip pinned, remove old + install new - upgrade.rs # Upgrade: mass upgrade all packages with summary + confirmation - list.rs # List: --explicit, --deps, --orphans filters - info.rs # Detailed package info: deps, reverse deps, disk usage, pin status - cache.rs # Cache management: list files + sizes, clean all cached files - completions.rs # Shell completions generation (bash/zsh/fish) via clap_complete - pin.rs # Pin/unpin packages to prevent updates - lockfile.rs # Export/import installed packages as JSON lockfile - selfupdate.rs # Self-update: download + replace ZL binary from GitHub releases - env.rs # Ephemeral environments: temporary/named isolated shells - core/ - build/ - mod.rs # BuildSpec, BuildSystem enum, detect_build_system(), build_package() - systems.rs # Build system implementations: autotools, cmake, meson, cargo, make - elf/ - analysis.rs # Read ELF metadata with goblin (interpreter, needed libs, rpath, soname) - patcher.rs # Patch ELF with elb (set interpreter, set runpath) — uses profile.page_size - path/ - mod.rs # PathMapping struct — dynamic FHS-to-ZL path translation via SystemProfile - remapper.rs # Rewrite paths in text files and shebangs - transaction.rs # Atomic install transaction: tracks files/dirs/symlinks/DB, rollback on failure - conflicts.rs # Conflict detection: file ownership, binary/lib name, version constraints, declared conflicts - verify.rs # Package verification: SHA256 checksum + GPG signature (best-effort) - graph/ - model.rs # PackageId, PackageNode, DependencyEdge, DepGraph (petgraph) - resolver.rs # Topological sort, cycle detection, orphan detection - verifier.rs # Post-install verification — uses profile.lib_dirs for lib search - db/ - schema.rs # redb table definitions (PACKAGES, FILE_OWNERS, LIB_INDEX, DEPENDENCIES, PINNED) - ops.rs # CRUD: packages, file ownership, lib index, dependencies, pinning, multi-version queries, plugin metadata - plugin/ - mod.rs # SourcePlugin trait, PackageCandidate, ExtractedPackage, PluginRegistry - pacman/ - mod.rs # PacmanPlugin: SourcePlugin + provides-based resolution - mirror.rs # Mirror list parsing and URL construction - database.rs # Sync DB download/parsing with retry (pacman desc format) - package.rs # .pkg.tar.zst download with retry, extraction, .PKGINFO parsing - aur/ - mod.rs # AurPlugin: live AUR RPC v5 queries, git clone + makepkg build - apt/ - mod.rs # AptPlugin: Packages.gz sync, per-distro mirror/suite config - index.rs # APT Packages index parser (RFC 2822-like format) - deb.rs # .deb extraction (ar → data.tar.{gz,xz,zst,bz2}) + download - github/ - mod.rs # GithubPlugin: GitHub Releases API, smart asset selection, extract -``` +### Install flow (the core operation) + +1. **Detect system** — `SystemProfile::detect()` auto-detects arch, dynamic linker, libc, lib dirs, filesystem layout +2. **Resolve source** — if `--from` omitted, queries ALL plugins in parallel; user picks if multiple hits (`pick_source()` in `cli/install.rs`) +3. **Resolve deps** — recursive dependency resolution with cycle detection (`cli/deps.rs`) +4. **Check conflicts** — 5 types: file ownership, binary name, library soname, declared conflicts, version constraints (`core/conflicts.rs`) +5. **Download** in parallel (4 threads via `thread::scope`) with progress bars and retry +6. **Verify** — SHA256 checksum + GPG signature (best-effort) +7. **Extract** and analyze ELF binaries with `goblin` +8. **Patch** ELF binaries with `elb` (set interpreter, RUNPATH) using detected page size +9. **Remap** FHS paths to ZL-managed directories (`core/path/`) +10. **Install** atomically — `Transaction` tracks all changes; rollback on any failure +11. **Track** in redb database + dependency graph + +### Startup flow (`main.rs`) + +`main()` → parse CLI → init tracing → `run()`: +1. Early-exit commands (`completions`, `self-update`) run without setup +2. Load config from `~/.config/zl/config.toml` +3. Detect `SystemProfile` and apply config overrides +4. Create `ZlPaths`, ensure directory structure exists +5. Open `ZlDatabase` (redb) +6. Register all plugins: pacman → aur → apt → github (order = priority) +7. Dispatch to command handler ### Key abstractions -- **SystemProfile** (`system/mod.rs`): auto-detected host profile (arch, interpreter, libc, lib dirs, layout). Built once at startup, passed to all modules. Replaces all hardcoded FHS assumptions. -- **SourcePlugin trait** (`plugin/mod.rs`): interface every package source implements (search, resolve, download, extract, sync) -- **InstallPlan** (`cli/deps.rs`): result of recursive dependency resolution — ordered list of packages to install -- **PathMapping** (`core/path/mod.rs`): maps FHS paths to ZL-managed paths for a specific package, using SystemProfile -- **Transaction** (`core/transaction.rs`): atomic install transaction tracking filesystem + DB changes with rollback support -- **ConflictReport** (`core/conflicts.rs`): pre-install conflict detection (file ownership, binary/library names, version constraints, declared conflicts) -- **VerifyResult** (`core/verify.rs`): package integrity verification — SHA256 checksum + GPG signature -- **BuildSpec/BuildSystem** (`core/build/mod.rs`): source build specification with auto-detection of build systems -- **DepGraph** (`core/graph/model.rs`): petgraph-based dependency graph tracking all packages and their relationships -- **ZlDatabase** (`core/db/ops.rs`): redb-based persistent storage for installed packages, file ownership, library index, dependency tracking, package pinning, multi-version queries -### Key crates -| Crate | Purpose | -|-------|---------| -| `goblin` | Read ELF metadata (interpreter, needed libs, rpath, soname) | -| `elb` | Patch ELF binaries (set interpreter, set runpath) — pure Rust patchelf | -| `petgraph` | Dependency graph with topological sort, cycle detection | -| `redb` | Embedded key-value database (pure Rust, ACID, no C deps) | -| `libc` | System detection (sysconf for page size) | -| `clap` (derive) | CLI argument parsing | -| `clap_complete` | Shell completions generation (bash/zsh/fish) | -| `indicatif` | Progress bars for downloads and installs | -| `reqwest` (blocking+json) | HTTP downloads + JSON deserialization | -| `tar` + `zstd` + `flate2` | Archive extraction | -| `xz2` | XZ decompression (.tar.xz, Packages.xz) | -| `ar` | Ar archive reading (.deb format) | -| `bzip2` | BZip2 decompression (data.tar.bz2 in old .deb) | -| `zip` | Zip extraction (GitHub release assets) | -| `sha2` | SHA256 checksums for package verification | +- **`SystemProfile`** (`system/mod.rs`): Host profile (arch, interpreter, libc, lib dirs, layout). Built once, threaded through all modules. Replaces all hardcoded FHS assumptions. +- **`SourcePlugin` trait** (`plugin/mod.rs`): Interface every package source implements — `name()`, `search()`, `resolve()`, `download()`, `extract()`, `sync()`. Plugins are compile-time modules with trait objects, not dynamic libraries. +- **`Transaction`** (`core/transaction.rs`): Atomic install — tracks files/dirs/symlinks/DB entries created during install, rolls back everything on failure. +- **`DepGraph`** (`core/graph/model.rs`): petgraph-based dependency graph with topological sort, cycle detection, orphan detection. +- **`ZlDatabase`** (`core/db/ops.rs`): redb-based persistent store. Tables: PACKAGES, FILE_OWNERS, LIB_INDEX, DEPENDENCIES, PINNED, PLUGIN_METADATA. +- **`PathMapping`** (`core/path/mod.rs`): Dynamic FHS-to-ZL path translation using SystemProfile. +- **`PackageCandidate` / `ExtractedPackage`** (`plugin/mod.rs`): Common types shared across all plugins for package metadata and extracted content. -### ZL directory layout (runtime) -``` -~/.local/share/zl/ - bin/ # Symlinks to executables (user adds to PATH) - lib/ # Shared libraries (never duplicated) - share/ # Shared data files - etc/ # Config files - packages/ # Per-package directories (name-version/) - cache/ # Download cache - envs/ # Ephemeral/named environment roots - zl.redb # Package database (PACKAGES, FILE_OWNERS, LIB_INDEX, DEPENDENCIES, PINNED tables) -``` +### Plugin system -### Design decisions -- **Single crate, no workspace**: plugins are compile-time modules with trait objects, not dynamic libraries -- **Dynamic system detection over hardcoded paths**: interpreter detected from /bin/sh's PT_INTERP, lib dirs from ldconfig + ld.so.conf, layout auto-classified -- **RUNPATH over RPATH**: modern standard, respects LD_LIBRARY_PATH -- **redb over SQLite**: pure Rust, maintains single-binary zero-deps constraint -- **elb over shelling out to patchelf**: pure Rust, no external dependency -- **Parallel downloads with thread::scope**: up to 4 concurrent downloads, no tokio needed -- **Retry with exponential backoff**: all HTTP operations retry up to 3 times (1s, 2s, 4s) -- **Atomic transactions**: installs are wrapped in Transaction; on failure, all filesystem + DB changes are rolled back -- **Pre-install conflict detection**: 5 types of conflicts checked before any files are written -- **Side-by-side versions**: multiple versions of the same package can coexist, with `zl switch` to change the active one -- **Ephemeral environments**: temporary isolated shells where packages disappear on exit +All plugins implement `SourcePlugin` and are registered in `main.rs`. To add a new plugin: +1. Create `src/plugin//mod.rs` implementing `SourcePlugin` +2. Add `pub mod ;` in `src/plugin/mod.rs` +3. Instantiate and register in `main.rs`'s `run()` function -### SystemProfile detection chain -1. **Architecture**: `std::env::consts::ARCH` (compile-time, always correct for running binary) -2. **Page size**: `libc::sysconf(_SC_PAGESIZE)` (never hardcoded — 4K on x86_64, up to 64K on aarch64) -3. **Dynamic linker**: Read PT_INTERP from `/bin/sh` using goblin. Works on ANY distro. -4. **C library**: Derived from interpreter filename (`ld-linux-*` → glibc, `ld-musl-*` → musl) -5. **Library paths**: Combined from `ldconfig -p`, `/etc/ld.so.conf`, `LD_LIBRARY_PATH`, layout-specific dirs -6. **Layout**: Detected by filesystem markers (NixOS: `/nix/store`, Guix: `/gnu/store`, Termux: `$PREFIX`, merged: `/bin` → `/usr/bin`) +Current plugins: `pacman` (Arch repos), `aur` (AUR RPC v5 + makepkg), `apt` (Packages.gz + .deb), `github` (Releases API). -### Supported distros/layouts -- Standard FHS (Fedora, RHEL, SUSE, Void, etc.) -- Merged /usr (Arch, Ubuntu 22+, Fedora 33+, Debian 12+) -- Debian multiarch (`/usr/lib/x86_64-linux-gnu/`) -- Alpine/Void musl -- NixOS (`/nix/store`) -- GNU Guix (`/gnu/store`) -- Termux on Android -- GoboLinux -- Any architecture: x86_64, aarch64, armv7, riscv64, i686, s390x, ppc64le +### Command dispatch pattern -## Implementation Status +Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` function. The `handle` function receives the parsed args struct plus shared state (`ZlPaths`, `ZlDatabase`, `PluginRegistry`, `SystemProfile`, flags). Commands are dispatched via a `match` in `main.rs`. -### Phase 1: Core + Pacman plugin + Universal distro support (complete) -- [x] Project skeleton: Cargo.toml with all dependencies, full directory structure -- [x] `error.rs` — ZlError enum with all variants, ZlResult alias, retry_with_backoff() -- [x] `paths.rs` — ZlPaths struct with ensure_dirs() -- [x] `config.rs` — ZlConfig, GeneralConfig, SystemConfig, PluginConfig deserialization -- [x] `main.rs` — Full bootstrap: config, SystemProfile detection, paths, DB, plugin registry, CLI dispatch -- [x] `lib.rs` — Re-exports core modules including system -- [x] `system/` — Full SystemProfile detection (arch, interpreter, libc, lib dirs, layout) -- [x] `core/elf/` — ELF analysis (goblin) and patching (elb) with dynamic page size -- [x] `core/path/` — PathMapping with dynamic prefix map, script/shebang remapping -- [x] `core/graph/` — DepGraph, topological sort, cycle detection, orphan detection, verification -- [x] `core/db/` — redb tables, CRUD for packages/files/libs/deps/plugin metadata -- [x] `plugin/pacman/` — Full PacmanPlugin: sync, search, resolve, download, extract +### Error handling -### Phase 2: Dep resolution + Parallel downloads + Source builds + Error handling (complete) -- [x] `cli/deps.rs` — Recursive dependency resolution with cycle detection, install plan display -- [x] `cli/install.rs` — Full dep resolution, parallel downloads (4 threads), sequential install -- [x] `cli/remove.rs` — Improved orphan detection using dep table + lib needs -- [x] `cli/update.rs` — Uses install_single_package(), only updates explicit packages -- [x] `core/build/` — BuildSpec, BuildSystem enum, detect + build for autotools/cmake/meson/cargo/make -- [x] `core/db/ops.rs` — Dependency tracking: register, get, reverse lookup, remove -- [x] `plugin/pacman/mod.rs` — Provides-based virtual package resolution -- [x] `plugin/pacman/package.rs` — Download with retry, checksum cache verification -- [x] `plugin/pacman/database.rs` — DB sync with retry +- `ZlError` enum in `error.rs` (thiserror) for domain errors with `.suggestion()` hints +- `anyhow::Result` at the top level (`run()` returns `anyhow::Result<()>`) +- `retry_with_backoff()` in `error.rs` for HTTP retries (3 attempts: 1s, 2s, 4s) +- Tracing: default level `warn`; `-v` = info, `-vv` = debug -### Phase 3: Safety, UX, and package management features (complete) -- [x] `core/transaction.rs` — Atomic install transactions: track files/dirs/symlinks/DB entries, rollback on failure -- [x] `core/conflicts.rs` — Pre-install conflict detection: file ownership, binary name, library soname, declared conflicts, version constraints (with component-by-component comparison) -- [x] `core/db/schema.rs` — Added PINNED table for package pinning -- [x] `core/db/ops.rs` — Added pin_package(), unpin_package(), is_pinned(), list_pinned() -- [x] `cli/install.rs` — Integrated: conflict checking, transaction-wrapped installs, progress bars (indicatif) -- [x] `cli/update.rs` — Respects pinned packages (skips them during updates) -- [x] `cli/info.rs` — Detailed package info: name, version, source, status, deps, reverse deps, disk usage -- [x] `cli/cache.rs` — Cache management: `zl cache list` (files + sizes), `zl cache clean` (free space) -- [x] `cli/completions.rs` — Shell completions: `zl completions bash/zsh/fish` via clap_complete -- [x] `cli/list.rs` — Enhanced: `--explicit`, `--deps`, `--orphans` filters, pin status display -- [x] `cli/pin.rs` — Pin/unpin packages: `zl pin `, `zl unpin ` -- [x] `cli/lockfile.rs` — Lockfile: `zl export [file]`, `zl import ` (JSON format) -- [x] `cli/mod.rs` — All new commands wired up: Info, Cache, Completions, Pin, Unpin, Export, Import +### Key design constraints -### Phase 4: Security, multi-version, environments, and mass upgrade (complete) -- [x] `core/verify.rs` — Package verification: SHA256 checksum validation + GPG signature verification (best-effort, uses system gpg when available) -- [x] `cli/install.rs` — Integrated verification pipeline: all downloads verified before install, `--skip-verify` flag to bypass -- [x] `cli/install.rs` — Dry-run mode (`--dry-run` / `--simulate`): shows install plan without making changes -- [x] `cli/remove.rs` — Dry-run support, version-specific removal (`--version`), multi-version aware -- [x] `cli/update.rs` — Dry-run support, integrated verification -- [x] `cli/upgrade.rs` — Mass upgrade: `zl upgrade` checks all packages, shows summary, confirms, upgrades in batch. `--check` for preview-only, `--from` to filter by source -- [x] `cli/install.rs` — Multi-version support: install multiple versions side-by-side (e.g., `zl install firefox --version 120.0` and `--version 121.0`) -- [x] `cli/install.rs` — `zl switch `: change which version's binaries are active (symlinked to bin/) -- [x] `core/db/ops.rs` — Added `get_all_versions(name)`: query all installed versions of a package -- [x] `cli/selfupdate.rs` — Self-update: `zl self-update` downloads latest release from GitHub, verifies architecture, atomically replaces binary -- [x] `cli/env.rs` — Ephemeral environments: `zl env shell [name]` spawns an isolated shell with its own ZL root. Temporary envs are auto-deleted on exit. Named envs persist. -- [x] `cli/env.rs` — Environment management: `zl env list`, `zl env delete ` -- [x] `cli/mod.rs` — Global flags: `--dry-run`/`--simulate`, `--skip-verify` -- [x] `error.rs` — New error variants: GpgVerification, SelfUpdate, Environment +- **Single binary, zero C deps**: redb over SQLite, elb over patchelf, no tokio (thread::scope for parallelism) +- **Dynamic detection over hardcoded paths**: interpreter from /bin/sh's PT_INTERP, lib dirs from ldconfig + ld.so.conf +- **RUNPATH over RPATH**: modern standard, respects LD_LIBRARY_PATH +- **Atomic transactions**: every install is wrapped; failure = full rollback -### Phase 5c: Error handling + XDG integration (complete) -- [x] `core/db/ops.rs` — DB init: propagate `open_table` errors instead of silently swallowing -- [x] `cli/install.rs` — Script remap failures now logged with `tracing::warn!` instead of `let _ =` -- [x] `cli/install.rs` — Thread join panic handled without `unwrap()`; poisoned Mutex recovered with `into_inner()` -- [x] `cli/install.rs` — `install_xdg_assets()`: symlinks `.desktop` files to `$XDG_DATA_HOME/applications/`, icons to `$XDG_DATA_HOME/icons/` (full tree). Uses `dirs::data_local_dir()` — works on all distros -- [x] `cli/install.rs` — `patch_desktop_exec()`: rewrites `Exec=` in .desktop files to strip absolute path prefix so binary is found via PATH -- [x] `cli/selfupdate.rs` — Distinct error message for 404 (no releases) vs network errors -- [x] `error.rs` — Better hint for `SelfUpdate` permission denied: suggests `sudo zl self-update` -- [x] `cli/mod.rs` — `verbose` changed from `bool` to `u8` (`ArgAction::Count`): `-v` = info, `-vv` = debug -- [x] `main.rs` — Default log level changed from `info` to `warn` (clean output by default) -- [x] `cli/selfupdate.rs` — Fixed GitHub repo URL: `supercosti21/zero_layer` (was `zero-layer/zl`) +### ZL directory layout (runtime) -### Phase 5b: Interactive multi-source selection (complete) -- [x] `cli/install.rs` — `pick_source()`: when `--from` is omitted, resolves from ALL plugins in parallel, then: - - 0 results → `PackageNotFound` error - - 1 result → auto-selects (no prompt) - - N results + `--yes` → picks first (highest-priority plugin, i.e. pacman) - - N results → shows `dialoguer::Select` for interactive choice -- [x] `cli/install.rs` — `handle()` now determines `from: String` before syncing, so `plugin.sync()` and `deps::resolve_with_deps()` always use a known source name +``` +~/.local/share/zl/ + bin/ # Symlinks to executables (user adds to PATH) + lib/ # Shared libraries + share/ # Shared data files + etc/ # Config files + packages/ # Per-package directories (name-version/) + cache/ # Download cache + envs/ # Ephemeral/named environment roots + zl.redb # Package database +``` -### Phase 5: New plugins — AUR, APT, GitHub Releases (complete) -- [x] `plugin/aur/mod.rs` — AurPlugin: live AUR RPC API v5 (search/resolve), git clone + makepkg build. `zl install yay --from aur` -- [x] `plugin/apt/index.rs` — APT Packages index parser: RFC 2822-like format, dep list parsing, short-description extraction -- [x] `plugin/apt/deb.rs` — .deb extraction: ar → data.tar.{gz,xz,zst,bz2}, SHA256 cache validation, retry download -- [x] `plugin/apt/mod.rs` — AptPlugin: Packages.gz sync per component, in-memory DB, configurable mirror/suite/arch. `zl install vim --from apt` -- [x] `plugin/github/mod.rs` — GithubPlugin: GitHub Releases API, smart asset scoring (arch, musl, format), extract tar.gz/tar.xz/zip/AppImage/bare binary. `zl install sharkdp/bat --from github` -- [x] `Cargo.toml` — Added: `ar`, `bzip2`, `zip`, `xz2`, reqwest `json` feature -- [x] `plugin/mod.rs` — Added `pub mod aur; pub mod apt; pub mod github;` -- [x] `main.rs` — All three plugins registered at startup (AurPlugin, AptPlugin, GithubPlugin) -- [x] All tests pass: **79 tests** (was 72 → +7 new: 3 apt::index, 1 aur, 3 github) +### Key crates -### All tests pass: 79 tests +| Crate | Purpose | +|-------|---------| +| `goblin` | Read ELF metadata (interpreter, needed libs, rpath, soname) | +| `elb` | Patch ELF binaries (set interpreter, set runpath) | +| `petgraph` | Dependency graph with topological sort, cycle detection | +| `redb` | Embedded key-value database (pure Rust, ACID) | +| `clap` (derive) | CLI argument parsing | +| `reqwest` (blocking+json) | HTTP client | +| `tar` + `zstd` + `flate2` + `xz2` + `bzip2` + `ar` + `zip` | Archive formats | +| `sha2` | SHA256 checksums | +| `indicatif` + `dialoguer` | Progress bars and interactive prompts | -### Removed -- `core/path/fhs.rs` — replaced by `system/` module. No more hardcoded FHS constants. +### Known compiler warnings -### Future work (Phase 6+) -- [ ] Additional plugins: RPM, AppImage, pip, npm, cargo -- [ ] Cross-OS support (macOS/Homebrew via HostAdapter trait) -- [ ] Interactive conflict resolution (dialoguer is already in deps) -- [ ] Async HTTP for even faster parallel downloads -- [ ] Hook system: pre/post install scripts -- [ ] TUI interactive mode for search and selection +There are existing dead-code warnings for `core/build/` (build system support is implemented but not yet wired into any plugin) and a few unused fields/functions in `config.rs` and `cli/completions.rs`. These are expected — the build system will be used by future plugins. From a8b5b848175e12276f2481b4a425a205cbd75a52 Mon Sep 17 00:00:00 2001 From: Claude Date: Mon, 23 Feb 2026 09:00:52 +0000 Subject: [PATCH 2/5] refactor: major code quality improvements across the codebase MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit - Box ZlError::ElfAnalysis to eliminate 121 result_large_err warnings - Remove 7 dead ZlError variants (Database, Network, PathRemap, etc.) - Add AppContext struct to bundle shared state, replacing 8-10 param functions - Fix all clippy warnings (225 → 0): collapsible_if, for_kv_map, dead_code, etc. - Fix real bug in remap_shebang() that prevented shebangs from ever being remapped - Add 32 new tests (154 → 186): core/elf/analysis, core/path, core/path/remapper - Add GitHub Actions CI workflow (fmt + clippy -D warnings + test) - Update CLAUDE.md to reflect current state https://claude.ai/code/session_01W29EWLxxuZuRnfP9pJjxuB --- .github/workflows/ci.yml | 20 ++++ CLAUDE.md | 18 ++-- src/cli/completions.rs | 1 + src/cli/deps.rs | 1 + src/cli/install.rs | 186 +++++++++++++++++++----------------- src/cli/list.rs | 4 +- src/cli/mod.rs | 16 ++++ src/cli/remove.rs | 44 ++++----- src/cli/selfupdate.rs | 3 +- src/cli/update.rs | 48 ++++------ src/cli/upgrade.rs | 56 +++++------ src/config.rs | 2 + src/core/build/mod.rs | 2 + src/core/build/systems.rs | 2 + src/core/conflicts.rs | 137 +++++++++++++------------- src/core/db/ops.rs | 6 +- src/core/elf/analysis.rs | 59 +++++++++++- src/core/elf/patcher.rs | 29 +++--- src/core/graph/model.rs | 10 +- src/core/graph/resolver.rs | 11 ++- src/core/graph/verifier.rs | 1 + src/core/path/mod.rs | 104 +++++++++++++++++++- src/core/path/remapper.rs | 87 ++++++++++++++++- src/core/transaction.rs | 30 +++--- src/core/verify.rs | 3 + src/error.rs | 60 ++---------- src/main.rs | 111 ++++++--------------- src/plugin/apt/deb.rs | 17 +++- src/plugin/apt/index.rs | 22 ++--- src/plugin/apt/mod.rs | 30 +++--- src/plugin/aur/mod.rs | 18 +++- src/plugin/github/mod.rs | 72 ++++++++++---- src/plugin/mod.rs | 6 +- src/plugin/pacman/mirror.rs | 1 + src/plugin/pacman/mod.rs | 22 +++-- src/system/arch.rs | 21 ++-- src/system/detect.rs | 18 ++-- src/system/mod.rs | 4 +- src/system/paths.rs | 11 +-- 39 files changed, 764 insertions(+), 529 deletions(-) create mode 100644 .github/workflows/ci.yml diff --git a/.github/workflows/ci.yml b/.github/workflows/ci.yml new file mode 100644 index 0000000..52a130a --- /dev/null +++ b/.github/workflows/ci.yml @@ -0,0 +1,20 @@ +name: CI + +on: + push: + branches: [main] + pull_request: + +env: + CARGO_TERM_COLOR: always + +jobs: + check: + runs-on: ubuntu-latest + steps: + - uses: actions/checkout@v4 + - uses: dtolnay/rust-toolchain@stable + - uses: Swatinem/rust-cache@v2 + - run: cargo fmt -- --check + - run: cargo clippy -- -D warnings + - run: cargo test diff --git a/CLAUDE.md b/CLAUDE.md index f60ba70..30ebe27 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -4,7 +4,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co ## Rules -- **Every time you make changes to the codebase**, update this CLAUDE.md file to reflect the new state (architecture, known issues, etc.). +- **Every time you make changes to the codebase**, update this CLAUDE.md file to reflect the new state (implementation status, module structure, known issues, etc.). - **If changes affect user-facing features, architecture, or usage**, also update README.md accordingly. ## Project Overview @@ -21,7 +21,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co cargo build # Debug build cargo build --release # Release build cargo run -- # Run (e.g., cargo run -- install firefox) -cargo test # Run all tests (154 tests: 74 bin + 80 lib) +cargo test # Run all tests (186 tests: 90 bin + 96 lib) cargo test # Run a single test by name cargo test -- --nocapture # Run tests with stdout visible cargo clippy # Lint @@ -29,7 +29,9 @@ cargo fmt # Format cargo fmt -- --check # Check formatting without modifying ``` -There are no integration tests — all tests are unit tests inside `#[cfg(test)]` modules within source files. There is no CI configuration; run `cargo test && cargo clippy && cargo fmt -- --check` locally before committing. +There are no integration tests — all tests are unit tests inside `#[cfg(test)]` modules within source files. + +**CI**: GitHub Actions workflow (`.github/workflows/ci.yml`) runs `cargo fmt --check`, `cargo clippy -- -D warnings`, and `cargo test` on every push to `main` and all PRs. ## Architecture @@ -79,11 +81,11 @@ Current plugins: `pacman` (Arch repos), `aur` (AUR RPC v5 + makepkg), `apt` (Pac ### Command dispatch pattern -Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` function. The `handle` function receives the parsed args struct plus shared state (`ZlPaths`, `ZlDatabase`, `PluginRegistry`, `SystemProfile`, flags). Commands are dispatched via a `match` in `main.rs`. +Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` function. Most `handle` functions receive the parsed args struct plus an `AppContext` reference (defined in `cli/mod.rs`), which bundles shared state: `ZlPaths`, `ZlDatabase`, `PluginRegistry`, `SystemProfile`, and flags (`auto_yes`, `dry_run`, `skip_verify`). Commands are dispatched via a `match` in `main.rs`. ### Error handling -- `ZlError` enum in `error.rs` (thiserror) for domain errors with `.suggestion()` hints +- `ZlError` enum in `error.rs` (thiserror, boxed where needed to keep size small) for domain errors with `.suggestion()` hints - `anyhow::Result` at the top level (`run()` returns `anyhow::Result<()>`) - `retry_with_backoff()` in `error.rs` for HTTP retries (3 attempts: 1s, 2s, 4s) - Tracing: default level `warn`; `-v` = info, `-vv` = debug @@ -123,6 +125,8 @@ Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` fun | `sha2` | SHA256 checksums | | `indicatif` + `dialoguer` | Progress bars and interactive prompts | -### Known compiler warnings +### Code quality -There are existing dead-code warnings for `core/build/` (build system support is implemented but not yet wired into any plugin) and a few unused fields/functions in `config.rs` and `cli/completions.rs`. These are expected — the build system will be used by future plugins. +- **Zero clippy warnings**: `cargo clippy -- -D warnings` passes clean +- **Zero `cargo fmt` diff**: all code is formatted +- **186 tests**: comprehensive coverage of core modules (conflicts, ELF, path mapping, DB, graph, transaction, verify, plugins, system detection) diff --git a/src/cli/completions.rs b/src/cli/completions.rs index 7fee3b6..dbc889a 100644 --- a/src/cli/completions.rs +++ b/src/cli/completions.rs @@ -13,6 +13,7 @@ pub fn handle(args: CompletionsArgs) -> ZlResult<()> { } /// Print usage instructions for installing shell completions +#[allow(dead_code)] pub fn print_instructions(shell: Shell) { match shell { Shell::Bash => { diff --git a/src/cli/deps.rs b/src/cli/deps.rs index 9b04a8f..74dd2c9 100644 --- a/src/cli/deps.rs +++ b/src/cli/deps.rs @@ -99,6 +99,7 @@ pub fn resolve_with_deps( }) } +#[allow(clippy::too_many_arguments)] fn resolve_recursive( candidate: &PackageCandidate, explicit: bool, diff --git a/src/cli/install.rs b/src/cli/install.rs index ba52964..a16421c 100644 --- a/src/cli/install.rs +++ b/src/cli/install.rs @@ -20,22 +20,13 @@ use crate::plugin::{PackageCandidate, PluginRegistry, SourcePlugin}; use crate::system::SystemProfile; use super::deps; -use super::{InstallArgs, SwitchArgs}; +use super::{AppContext, InstallArgs, SwitchArgs}; /// Maximum number of concurrent downloads const MAX_PARALLEL_DOWNLOADS: usize = 4; -pub fn handle( - args: InstallArgs, - paths: &ZlPaths, - db: &ZlDatabase, - registry: &PluginRegistry, - profile: &SystemProfile, - auto_yes: bool, - dry_run: bool, - skip_verify: bool, -) -> ZlResult<()> { - if dry_run { +pub fn handle(args: InstallArgs, ctx: &AppContext) -> ZlResult<()> { + if ctx.dry_run { println!("[DRY-RUN] Simulating install of {}...", args.package); } @@ -44,15 +35,18 @@ pub fn handle( // Otherwise, resolve from all plugins and let the user pick. let from: String = match args.from.as_deref() { Some(f) => f.to_string(), - None => pick_source(&args.package, args.version.as_deref(), registry, auto_yes)?, + None => pick_source( + &args.package, + args.version.as_deref(), + ctx.registry, + ctx.auto_yes, + )?, }; - let plugin = registry - .get(&from) - .ok_or_else(|| ZlError::Plugin { - plugin: from.clone(), - message: "No matching source plugin found".into(), - })?; + let plugin = ctx.registry.get(&from).ok_or_else(|| ZlError::Plugin { + plugin: from.clone(), + message: "No matching source plugin found".into(), + })?; println!("Syncing package database from {}...", plugin.display_name()); plugin.sync()?; @@ -63,9 +57,9 @@ pub fn handle( &args.package, args.version.as_deref(), Some(&from), - db, - registry, - profile, + ctx.db, + ctx.registry, + ctx.profile, )?; if plan.packages.is_empty() { @@ -77,11 +71,11 @@ pub fn handle( println!("Checking for conflicts..."); let candidates_refs: Vec<&PackageCandidate> = plan.packages.iter().map(|e| &e.candidate).collect(); - let conflict_report = conflicts::check_conflicts(&candidates_refs, db, paths)?; + let conflict_report = conflicts::check_conflicts(&candidates_refs, ctx.db, ctx.paths)?; if conflict_report.has_conflicts() { conflict_report.display(); - if !auto_yes { + if !ctx.auto_yes { eprintln!("\nConflicts must be resolved before installing."); eprintln!(" hint: remove conflicting packages with `zl remove` first"); return Err(ZlError::PackageConflict { @@ -95,7 +89,7 @@ pub fn handle( // 4. Display install plan deps::display_plan(&plan); - if dry_run { + if ctx.dry_run { println!( "\n[DRY-RUN] Would install {} package(s). No changes made.", plan.packages.len() @@ -104,7 +98,7 @@ pub fn handle( } // 5. Confirm - if !auto_yes { + if !ctx.auto_yes { print!("\nProceed with installation? [Y/n] "); use std::io::Write; std::io::stdout().flush()?; @@ -122,7 +116,7 @@ pub fn handle( let candidates: Vec<&PackageCandidate> = plan.packages.iter().map(|e| &e.candidate).collect(); println!("\nDownloading {} package(s)...", total); - let archives = download_parallel(&candidates, plugin, &paths.cache)?; + let archives = download_parallel(&candidates, plugin, &ctx.paths.cache)?; // 7. Verify all downloads println!("Verifying packages..."); @@ -131,9 +125,9 @@ pub fn handle( archive_path, candidate.checksum.as_deref(), &candidate.download_url, - skip_verify, + ctx.skip_verify, )?; - if !skip_verify { + if !ctx.skip_verify { tracing::info!(" {} — {}", candidate.name, result.message); } } @@ -158,10 +152,10 @@ pub fn handle( &entry.candidate, archive_path, entry.explicit, - paths, - db, + ctx.paths, + ctx.db, plugin, - profile, + ctx.profile, &mut txn, ) { Ok(()) => { @@ -171,7 +165,7 @@ pub fn handle( install_pb.finish_and_clear(); eprintln!("\nFailed to install {}: {}", entry.candidate.name, e); eprintln!("Rolling back {} installed package(s)...", installed_count); - txn.rollback(db); + txn.rollback(ctx.db); return Err(e); } } @@ -308,7 +302,9 @@ fn download_parallel( overall.finish_and_clear(); // Collect results, fail on first error - let results = results_mutex.into_inner().unwrap_or_else(|e| e.into_inner()); + let results = results_mutex + .into_inner() + .unwrap_or_else(|e| e.into_inner()); results.into_iter().collect() } @@ -374,6 +370,7 @@ pub fn install_single_package( /// Install a package from an already-downloaded archive. /// Extracts, patches ELF binaries, remaps scripts, places files, and registers in DB. /// Tracks all changes in the transaction for rollback support. +#[allow(clippy::too_many_arguments)] pub fn install_from_archive( candidate: &PackageCandidate, archive_path: &Path, @@ -421,7 +418,11 @@ pub fn install_from_archive( // Remap scripts for script_path in &extracted.script_files { if let Err(e) = remapper::remap_shebang(script_path, &mapping) { - tracing::warn!("Failed to remap shebang in {}: {}", script_path.display(), e); + tracing::warn!( + "Failed to remap shebang in {}: {}", + script_path.display(), + e + ); } if let Err(e) = remapper::remap_text_file(script_path, &mapping) { tracing::warn!("Failed to remap paths in {}: {}", script_path.display(), e); @@ -455,12 +456,11 @@ pub fn install_from_archive( installed_files.push(dest.clone()); // Track shared libraries - if analysis::is_elf_file(&dest) { - if let Ok(info) = analysis::analyze(&dest) { - if let Some(ref soname) = info.soname { - provides_libs.insert(soname.clone(), dest.clone()); - } - } + if analysis::is_elf_file(&dest) + && let Ok(info) = analysis::analyze(&dest) + && let Some(ref soname) = info.soname + { + provides_libs.insert(soname.clone(), dest.clone()); } } @@ -540,7 +540,7 @@ pub fn install_from_archive( for file in &installed_files { db.register_file(&file.to_string_lossy(), &pkg_key)?; } - for (soname, _) in &provides_libs { + for soname in provides_libs.keys() { db.register_lib(soname, &pkg_key)?; } @@ -651,10 +651,10 @@ fn remove_pkg_bin_symlinks(pkg_dir: &Path, bin_dir: &Path) -> ZlResult<()> { for entry in entries { let entry = entry?; let path = entry.path(); - if let Ok(target) = std::fs::read_link(&path) { - if target.starts_with(pkg_dir) { - std::fs::remove_file(&path)?; - } + if let Ok(target) = std::fs::read_link(&path) + && target.starts_with(pkg_dir) + { + std::fs::remove_file(&path)?; } } Ok(()) @@ -662,7 +662,7 @@ fn remove_pkg_bin_symlinks(pkg_dir: &Path, bin_dir: &Path) -> ZlResult<()> { /// Remove lib symlinks for a specific package version fn remove_pkg_lib_symlinks(node: &PackageNode, lib_dir: &Path) -> ZlResult<()> { - for (soname, _) in &node.provides_libs { + for soname in node.provides_libs.keys() { let link_path = lib_dir.join(soname); if link_path.symlink_metadata().is_ok() { std::fs::remove_file(&link_path)?; @@ -741,37 +741,42 @@ fn install_xdg_assets(pkg_dir: &Path, bin_dir: &Path, txn: &mut Transaction) { if !src_dir.is_dir() { continue; } - if let Ok(()) = std::fs::create_dir_all(&xdg_apps) { - if let Ok(entries) = std::fs::read_dir(&src_dir) { - for entry in entries.flatten() { - let path = entry.path(); - if path.extension().and_then(|e| e.to_str()) != Some("desktop") { + if let Ok(()) = std::fs::create_dir_all(&xdg_apps) + && let Ok(entries) = std::fs::read_dir(&src_dir) + { + for entry in entries.flatten() { + let path = entry.path(); + if path.extension().and_then(|e| e.to_str()) != Some("desktop") { + continue; + } + let dest = xdg_apps.join(entry.file_name()); + // Rewrite Exec= to strip absolute path prefixes + if let Ok(content) = std::fs::read_to_string(&path) { + let patched = patch_desktop_exec(&content, bin_dir); + if let Err(e) = std::fs::write(&dest, patched) { + tracing::warn!("Failed to write .desktop {}: {}", dest.display(), e); continue; } - let dest = xdg_apps.join(entry.file_name()); - // Rewrite Exec= to strip absolute path prefixes - if let Ok(content) = std::fs::read_to_string(&path) { - let patched = patch_desktop_exec(&content, bin_dir); - if let Err(e) = std::fs::write(&dest, patched) { - tracing::warn!("Failed to write .desktop {}: {}", dest.display(), e); - continue; - } - } else { - // Fallback: symlink as-is - if let Err(e) = unix_fs::symlink(&path, &dest) { - tracing::warn!("Failed to link .desktop {}: {}", dest.display(), e); - continue; - } + } else { + // Fallback: symlink as-is + if let Err(e) = unix_fs::symlink(&path, &dest) { + tracing::warn!("Failed to link .desktop {}: {}", dest.display(), e); + continue; } - txn.track_symlink(&dest); - tracing::debug!("Desktop entry: {}", dest.display()); } + txn.track_symlink(&dest); + tracing::debug!("Desktop entry: {}", dest.display()); } } } // Icons (preserve full subdirectory tree) - for subdir in &["usr/share/icons", "share/icons", "usr/share/pixmaps", "share/pixmaps"] { + for subdir in &[ + "usr/share/icons", + "share/icons", + "usr/share/pixmaps", + "share/pixmaps", + ] { let src_dir = pkg_dir.join(subdir); if !src_dir.is_dir() { continue; @@ -785,10 +790,10 @@ fn install_xdg_assets(pkg_dir: &Path, bin_dir: &Path, txn: &mut Transaction) { Err(_) => continue, }; let dest = xdg_icons.join(rel); - if let Some(parent) = dest.parent() { - if std::fs::create_dir_all(parent).is_err() { - continue; - } + if let Some(parent) = dest.parent() + && std::fs::create_dir_all(parent).is_err() + { + continue; } if dest.symlink_metadata().is_ok() { continue; // don't overwrite existing icon @@ -845,7 +850,11 @@ fn pick_source( for plugin in registry.all() { // Sync first so the local DB is up to date before querying if let Err(e) = plugin.sync() { - tracing::debug!("Failed to sync {} during source discovery: {}", plugin.name(), e); + tracing::debug!( + "Failed to sync {} during source discovery: {}", + plugin.name(), + e + ); } match plugin.resolve(package, version) { @@ -896,21 +905,20 @@ fn pick_source( _ => { let items: Vec<&str> = found.iter().map(|(_, label, _)| label.as_str()).collect(); - let selection = dialoguer::Select::with_theme( - &dialoguer::theme::ColorfulTheme::default(), - ) - .with_prompt(format!( - "Found '{}' in {} sources. Select one", - package, - found.len() - )) - .items(&items) - .default(0) - .interact() - .map_err(|e| ZlError::Plugin { - plugin: "interactive".into(), - message: format!("Selection cancelled: {}", e), - })?; + let selection = + dialoguer::Select::with_theme(&dialoguer::theme::ColorfulTheme::default()) + .with_prompt(format!( + "Found '{}' in {} sources. Select one", + package, + found.len() + )) + .items(&items) + .default(0) + .interact() + .map_err(|e| ZlError::Plugin { + plugin: "interactive".into(), + message: format!("Selection cancelled: {}", e), + })?; Ok(found[selection].0.clone()) } diff --git a/src/cli/list.rs b/src/cli/list.rs index d2f540a..fe9b3f0 100644 --- a/src/cli/list.rs +++ b/src/cli/list.rs @@ -40,8 +40,8 @@ pub fn handle(args: ListArgs, db: &ZlDatabase) -> ZlResult<()> { pinned_list.into_iter().map(|(name, _)| name).collect(); println!( - "{:<30} {:<20} {:<15} {:>6} {}", - "Name", "Version", "Source", "Files", "Status" + "{:<30} {:<20} {:<15} {:>6} Status", + "Name", "Version", "Source", "Files" ); println!("{}", "-".repeat(85)); diff --git a/src/cli/mod.rs b/src/cli/mod.rs index 150c27f..bd65bf2 100644 --- a/src/cli/mod.rs +++ b/src/cli/mod.rs @@ -16,6 +16,22 @@ pub mod upgrade; use clap::{ArgAction, Args, Parser, Subcommand}; use clap_complete::Shell; +use crate::core::db::ops::ZlDatabase; +use crate::paths::ZlPaths; +use crate::plugin::PluginRegistry; +use crate::system::SystemProfile; + +/// Shared application context passed to all command handlers. +pub struct AppContext<'a> { + pub paths: &'a ZlPaths, + pub db: &'a ZlDatabase, + pub registry: &'a PluginRegistry, + pub profile: &'a SystemProfile, + pub auto_yes: bool, + pub dry_run: bool, + pub skip_verify: bool, +} + #[derive(Parser)] #[command( name = "zl", diff --git a/src/cli/remove.rs b/src/cli/remove.rs index f34f708..5ad11ab 100644 --- a/src/cli/remove.rs +++ b/src/cli/remove.rs @@ -2,15 +2,13 @@ use crate::core::db::ops::ZlDatabase; use crate::error::{ZlError, ZlResult}; use crate::paths::ZlPaths; -use super::RemoveArgs; +use super::{AppContext, RemoveArgs}; -pub fn handle( - args: RemoveArgs, - paths: &ZlPaths, - db: &ZlDatabase, - auto_yes: bool, - dry_run: bool, -) -> ZlResult<()> { +pub fn handle(args: RemoveArgs, ctx: &AppContext) -> ZlResult<()> { + let paths = ctx.paths; + let db = ctx.db; + let auto_yes = ctx.auto_yes; + let dry_run = ctx.dry_run; // If a specific version was requested, remove only that version if let Some(ref version) = args.version { return remove_specific_version( @@ -91,7 +89,7 @@ pub fn handle( remove_bin_symlinks(&node.installed_files, &paths.bin)?; // 4. Remove lib symlinks - for (soname, _) in &node.provides_libs { + for soname in node.provides_libs.keys() { let link_path = paths.lib.join(soname); if link_path.symlink_metadata().is_ok() { std::fs::remove_file(&link_path)?; @@ -199,7 +197,7 @@ fn remove_single( remove_bin_symlinks(&node.installed_files, &paths.bin)?; - for (soname, _) in &node.provides_libs { + for soname in node.provides_libs.keys() { let link_path = paths.lib.join(soname); if link_path.symlink_metadata().is_ok() { std::fs::remove_file(&link_path)?; @@ -234,18 +232,18 @@ fn remove_bin_symlinks( for entry in entries { let entry = entry?; let path = entry.path(); - if path.symlink_metadata().is_ok() { - if let Ok(target) = std::fs::read_link(&path) { - // Check if symlink target belongs to this package - for installed in installed_files { - if target == *installed || target.starts_with(installed) { - std::fs::remove_file(&path)?; - tracing::debug!( - "Removed bin symlink: {}", - entry.file_name().to_string_lossy() - ); - break; - } + if path.symlink_metadata().is_ok() + && let Ok(target) = std::fs::read_link(&path) + { + // Check if symlink target belongs to this package + for installed in installed_files { + if target == *installed || target.starts_with(installed) { + std::fs::remove_file(&path)?; + tracing::debug!( + "Removed bin symlink: {}", + entry.file_name().to_string_lossy() + ); + break; } } } @@ -307,7 +305,7 @@ fn remove_orphans(paths: &ZlPaths, db: &ZlDatabase) -> ZlResult<()> { println!(" - {}-{}", orphan.id.name, orphan.id.version); // Remove lib symlinks - for (soname, _) in &orphan.provides_libs { + for soname in orphan.provides_libs.keys() { let link_path = paths.lib.join(soname); if link_path.symlink_metadata().is_ok() { std::fs::remove_file(&link_path)?; diff --git a/src/cli/selfupdate.rs b/src/cli/selfupdate.rs index 4bba538..112abce 100644 --- a/src/cli/selfupdate.rs +++ b/src/cli/selfupdate.rs @@ -70,7 +70,8 @@ pub fn handle() -> ZlResult<()> { if !response.status().is_success() { let msg = if response.status().as_u16() == 404 { - "No releases found on GitHub — check that the repository has published releases".to_string() + "No releases found on GitHub — check that the repository has published releases" + .to_string() } else { format!( "GitHub API returned status {}: check your internet connection or try again later", diff --git a/src/cli/update.rs b/src/cli/update.rs index 2a768ae..fc62733 100644 --- a/src/cli/update.rs +++ b/src/cli/update.rs @@ -1,34 +1,22 @@ -use crate::core::db::ops::ZlDatabase; use crate::error::{ZlError, ZlResult}; -use crate::paths::ZlPaths; -use crate::plugin::PluginRegistry; -use crate::system::SystemProfile; - -use super::{RemoveArgs, UpdateArgs}; - -pub fn handle( - args: UpdateArgs, - paths: &ZlPaths, - db: &ZlDatabase, - registry: &PluginRegistry, - profile: &SystemProfile, - _auto_yes: bool, - dry_run: bool, - skip_verify: bool, -) -> ZlResult<()> { - if dry_run { + +use super::{AppContext, RemoveArgs, UpdateArgs}; + +pub fn handle(args: UpdateArgs, ctx: &AppContext) -> ZlResult<()> { + if ctx.dry_run { println!("[DRY-RUN] Simulating update..."); } // Get list of packages to update let packages = match args.package { Some(ref name) => { - let pkg = db + let pkg = ctx + .db .get_package_by_name(name)? .ok_or_else(|| ZlError::PackageNotFound { name: name.clone() })?; vec![pkg] } - None => db.list_packages()?, + None => ctx.db.list_packages()?, }; if packages.is_empty() { @@ -37,7 +25,7 @@ pub fn handle( } // Sync all plugins first - for plugin in registry.all() { + for plugin in ctx.registry.all() { if let Err(e) = plugin.sync() { tracing::warn!("Failed to sync {}: {}", plugin.name(), e); } @@ -53,7 +41,7 @@ pub fn handle( } // Skip pinned packages - if db.is_pinned(&pkg.id.name)? { + if ctx.db.is_pinned(&pkg.id.name)? { tracing::info!("{} is pinned, skipping update", pkg.id.name); skipped_pinned += 1; continue; @@ -61,7 +49,7 @@ pub fn handle( // Find the plugin that manages this package let source_name = pkg.id.source.split('/').next().unwrap_or(&pkg.id.source); - let plugin = match registry.get(source_name) { + let plugin = match ctx.registry.get(source_name) { Some(p) => p, None => { tracing::warn!("No plugin found for source '{}', skipping", pkg.id.source); @@ -77,7 +65,7 @@ pub fn handle( pkg.id.name, pkg.id.version, candidate.version ); - if dry_run { + if ctx.dry_run { updated += 1; continue; } @@ -88,17 +76,17 @@ pub fn handle( cascade: false, version: Some(pkg.id.version.clone()), }; - super::remove::handle(remove_args, paths, db, true, false)?; + super::remove::handle(remove_args, ctx)?; // Install new version directly (skip dep resolution for updates) super::install::install_single_package( &candidate, true, // maintain explicit status - paths, - db, + ctx.paths, + ctx.db, plugin, - profile, - skip_verify, + ctx.profile, + ctx.skip_verify, )?; updated += 1; @@ -112,7 +100,7 @@ pub fn handle( } } - if dry_run { + if ctx.dry_run { if updated == 0 { println!("[DRY-RUN] All packages are up to date."); } else { diff --git a/src/cli/upgrade.rs b/src/cli/upgrade.rs index 8e01e62..116c29c 100644 --- a/src/cli/upgrade.rs +++ b/src/cli/upgrade.rs @@ -1,10 +1,7 @@ -use crate::core::db::ops::ZlDatabase; use crate::error::ZlResult; -use crate::paths::ZlPaths; -use crate::plugin::{PackageCandidate, PluginRegistry}; -use crate::system::SystemProfile; +use crate::plugin::PackageCandidate; -use super::{RemoveArgs, UpgradeArgs}; +use super::{AppContext, RemoveArgs, UpgradeArgs}; /// An available upgrade for a single package struct UpgradeEntry { @@ -15,17 +12,8 @@ struct UpgradeEntry { source_name: String, } -pub fn handle( - args: UpgradeArgs, - paths: &ZlPaths, - db: &ZlDatabase, - registry: &PluginRegistry, - profile: &SystemProfile, - _auto_yes: bool, - dry_run: bool, - skip_verify: bool, -) -> ZlResult<()> { - let packages = db.list_packages()?; +pub fn handle(args: UpgradeArgs, ctx: &AppContext) -> ZlResult<()> { + let packages = ctx.db.list_packages()?; if packages.is_empty() { println!("No packages installed."); @@ -34,11 +22,11 @@ pub fn handle( // Sync all relevant plugins println!("Syncing package databases..."); - for plugin in registry.all() { - if let Some(ref source_filter) = args.from { - if plugin.name() != source_filter { - continue; - } + for plugin in ctx.registry.all() { + if let Some(ref source_filter) = args.from + && plugin.name() != source_filter + { + continue; } if let Err(e) = plugin.sync() { tracing::warn!("Failed to sync {}: {}", plugin.name(), e); @@ -55,7 +43,7 @@ pub fn handle( continue; } - if db.is_pinned(&pkg.id.name)? { + if ctx.db.is_pinned(&pkg.id.name)? { skipped_pinned += 1; continue; } @@ -63,13 +51,13 @@ pub fn handle( let source_name = pkg.id.source.split('/').next().unwrap_or(&pkg.id.source); // If --from filter is set, skip packages from other sources - if let Some(ref source_filter) = args.from { - if source_name != source_filter { - continue; - } + if let Some(ref source_filter) = args.from + && source_name != source_filter + { + continue; } - let plugin = match registry.get(source_name) { + let plugin = match ctx.registry.get(source_name) { Some(p) => p, None => continue, }; @@ -127,7 +115,7 @@ pub fn handle( return Ok(()); } - if dry_run { + if ctx.dry_run { println!( "\n[DRY-RUN] Would upgrade {} package(s). No changes made.", upgrades.len() @@ -157,7 +145,7 @@ pub fn handle( entry.name, entry.old_version, entry.new_version ); - let plugin = match registry.get(&entry.source_name) { + let plugin = match ctx.registry.get(&entry.source_name) { Some(p) => p, None => { eprintln!(" Plugin {} not found, skipping", entry.source_name); @@ -172,7 +160,7 @@ pub fn handle( cascade: false, version: Some(entry.old_version.clone()), }; - if let Err(e) = super::remove::handle(remove_args, paths, db, true, false) { + if let Err(e) = super::remove::handle(remove_args, ctx) { eprintln!(" Failed to remove old version: {}", e); failed += 1; continue; @@ -182,11 +170,11 @@ pub fn handle( match super::install::install_single_package( &entry.candidate, true, - paths, - db, + ctx.paths, + ctx.db, plugin, - profile, - skip_verify, + ctx.profile, + ctx.skip_verify, ) { Ok(()) => { upgraded += 1; diff --git a/src/config.rs b/src/config.rs index 9a7b2bb..80685c3 100644 --- a/src/config.rs +++ b/src/config.rs @@ -47,6 +47,7 @@ pub struct GeneralConfig { pub struct PluginConfig { /// Whether this plugin is enabled #[serde(default = "default_true")] + #[allow(dead_code)] pub enabled: bool, /// Cache directory for this plugin (set at runtime) #[serde(skip)] @@ -73,6 +74,7 @@ impl ZlConfig { } /// Load config from a specific path + #[allow(dead_code)] pub fn load_from(path: &Path) -> ZlResult { let content = std::fs::read_to_string(path)?; toml::from_str(&content).map_err(|e| crate::error::ZlError::Config(e.to_string())) diff --git a/src/core/build/mod.rs b/src/core/build/mod.rs index c0bb552..58bba0f 100644 --- a/src/core/build/mod.rs +++ b/src/core/build/mod.rs @@ -1,3 +1,5 @@ +#![allow(dead_code)] + pub mod systems; use std::path::{Path, PathBuf}; diff --git a/src/core/build/systems.rs b/src/core/build/systems.rs index 8c1c354..ea85a84 100644 --- a/src/core/build/systems.rs +++ b/src/core/build/systems.rs @@ -1,3 +1,5 @@ +#![allow(dead_code)] + use std::path::Path; use std::process::Command; diff --git a/src/core/conflicts.rs b/src/core/conflicts.rs index 55b3fa7..44ce625 100644 --- a/src/core/conflicts.rs +++ b/src/core/conflicts.rs @@ -30,12 +30,12 @@ pub enum Conflict { new_package: String, }, /// The candidate's `conflicts` field lists an installed package. - DeclaredConflict { + Declared { package: String, conflicts_with: String, }, /// A dependency version constraint cannot be satisfied by the installed version. - VersionConflict { + Version { dependency: String, required_by: String, required_version: String, @@ -73,14 +73,14 @@ impl fmt::Display for Conflict { "library conflict: '{soname}' is provided by '{existing_package}', \ also provided by '{new_package}'" ), - Conflict::DeclaredConflict { + Conflict::Declared { package, conflicts_with, } => write!( f, "declared conflict: '{package}' conflicts with installed '{conflicts_with}'" ), - Conflict::VersionConflict { + Conflict::Version { dependency, required_by, required_version, @@ -159,7 +159,7 @@ pub fn check_conflicts( // Check if the candidate declares a conflict with this installed package for conflict_pattern in &candidate.conflicts { if matches_package_name(conflict_pattern, installed_name) { - conflicts.push(Conflict::DeclaredConflict { + conflicts.push(Conflict::Declared { package: new_key.clone(), conflicts_with: format!( "{}-{}", @@ -177,15 +177,15 @@ pub fn check_conflicts( let (dep_name, constraint) = parse_dependency_spec(dep_spec); if let Some(constraint) = constraint { // Look up installed version of this dependency - if let Some(installed_dep) = db.get_package_by_name(dep_name)? { - if !satisfies_constraint(&installed_dep.id.version, constraint) { - conflicts.push(Conflict::VersionConflict { - dependency: dep_name.to_string(), - required_by: new_key.clone(), - required_version: constraint.to_string(), - installed_version: installed_dep.id.version.clone(), - }); - } + if let Some(installed_dep) = db.get_package_by_name(dep_name)? + && !satisfies_constraint(&installed_dep.id.version, constraint) + { + conflicts.push(Conflict::Version { + dependency: dep_name.to_string(), + required_by: new_key.clone(), + required_version: constraint.to_string(), + installed_version: installed_dep.id.version.clone(), + }); } } } @@ -228,14 +228,14 @@ fn scan_dir_for_ownership_conflicts( for entry in walker { if entry.file_type().is_file() || entry.file_type().is_symlink() { let path_str = entry.path().to_string_lossy().to_string(); - if let Some(owner) = db.file_owner(&path_str)? { - if owner != new_key { - conflicts.push(Conflict::FileOwnership { - path: path_str, - existing_owner: owner, - new_package: new_key.to_string(), - }); - } + if let Some(owner) = db.file_owner(&path_str)? + && owner != new_key + { + conflicts.push(Conflict::FileOwnership { + path: path_str, + existing_owner: owner, + new_package: new_key.to_string(), + }); } } } @@ -260,14 +260,14 @@ fn check_binary_conflicts( if let Ok(target) = std::fs::read_link(&bin_path) { let target_str = target.to_string_lossy(); // Package dirs are named "name-version" under packages/ - if let Some(owner) = extract_package_key_from_path(&target_str) { - if owner != new_key { - conflicts.push(Conflict::BinaryName { - name: candidate.name.clone(), - existing_package: owner, - new_package: new_key.to_string(), - }); - } + if let Some(owner) = extract_package_key_from_path(&target_str) + && owner != new_key + { + conflicts.push(Conflict::BinaryName { + name: candidate.name.clone(), + existing_package: owner, + new_package: new_key.to_string(), + }); } } else { // Not a symlink but a regular file — still a conflict @@ -282,18 +282,18 @@ fn check_binary_conflicts( // Also check the `provides` list: each provided name gets a symlink in bin/ for provided in &candidate.provides { let provided_bin = paths.bin.join(provided); - if provided_bin.symlink_metadata().is_ok() { - if let Ok(target) = std::fs::read_link(&provided_bin) { - let target_str = target.to_string_lossy(); - if let Some(owner) = extract_package_key_from_path(&target_str) { - if owner != new_key { - conflicts.push(Conflict::BinaryName { - name: provided.clone(), - existing_package: owner, - new_package: new_key.to_string(), - }); - } - } + if provided_bin.symlink_metadata().is_ok() + && let Ok(target) = std::fs::read_link(&provided_bin) + { + let target_str = target.to_string_lossy(); + if let Some(owner) = extract_package_key_from_path(&target_str) + && owner != new_key + { + conflicts.push(Conflict::BinaryName { + name: provided.clone(), + existing_package: owner, + new_package: new_key.to_string(), + }); } } } @@ -312,16 +312,15 @@ fn check_library_conflicts( // Check if any soname provided by the candidate is already provided by another package for provided in &candidate.provides { // Library provides are typically sonames like "libfoo.so.3" - if provided.contains(".so") { - if let Some(existing_provider) = db.lib_provider(provided)? { - if existing_provider != new_key { - conflicts.push(Conflict::LibrarySoname { - soname: provided.clone(), - existing_package: existing_provider, - new_package: new_key.to_string(), - }); - } - } + if provided.contains(".so") + && let Some(existing_provider) = db.lib_provider(provided)? + && existing_provider != new_key + { + conflicts.push(Conflict::LibrarySoname { + soname: provided.clone(), + existing_package: existing_provider, + new_package: new_key.to_string(), + }); } } Ok(()) @@ -376,16 +375,16 @@ pub fn satisfies_constraint(version: &str, constraint: &str) -> bool { return true; } - let (op, req_version) = if constraint.starts_with(">=") { - (">=", constraint[2..].trim()) - } else if constraint.starts_with("<=") { - ("<=", constraint[2..].trim()) - } else if constraint.starts_with('>') { - (">", constraint[1..].trim()) - } else if constraint.starts_with('<') { - ("<", constraint[1..].trim()) - } else if constraint.starts_with('=') { - ("=", constraint[1..].trim()) + let (op, req_version) = if let Some(rest) = constraint.strip_prefix(">=") { + (">=", rest.trim()) + } else if let Some(rest) = constraint.strip_prefix("<=") { + ("<=", rest.trim()) + } else if let Some(rest) = constraint.strip_prefix('>') { + (">", rest.trim()) + } else if let Some(rest) = constraint.strip_prefix('<') { + ("<", rest.trim()) + } else if let Some(rest) = constraint.strip_prefix('=') { + ("=", rest.trim()) } else { // No recognised operator — treat as exact match ("=", constraint) @@ -438,10 +437,10 @@ fn compare_versions(a: &str, b: &str) -> std::cmp::Ordering { fn extract_package_key_from_path(path: &str) -> Option { let parts: Vec<&str> = path.split('/').collect(); for (i, part) in parts.iter().enumerate() { - if *part == "packages" { - if let Some(key) = parts.get(i + 1) { - return Some(key.to_string()); - } + if *part == "packages" + && let Some(key) = parts.get(i + 1) + { + return Some(key.to_string()); } } None @@ -645,7 +644,7 @@ mod tests { assert!(report.has_conflicts()); assert_eq!(report.conflicts.len(), 1); match &report.conflicts[0] { - Conflict::DeclaredConflict { + Conflict::Declared { package, conflicts_with, } => { @@ -684,7 +683,7 @@ mod tests { assert!(report.has_conflicts()); assert_eq!(report.conflicts.len(), 1); match &report.conflicts[0] { - Conflict::VersionConflict { + Conflict::Version { dependency, required_by, required_version, @@ -774,7 +773,7 @@ mod tests { #[test] fn test_conflict_report_display() { let report = ConflictReport { - conflicts: vec![Conflict::DeclaredConflict { + conflicts: vec![Conflict::Declared { package: "a-1.0".into(), conflicts_with: "b-2.0".into(), }], diff --git a/src/core/db/ops.rs b/src/core/db/ops.rs index 948c01e..be81146 100644 --- a/src/core/db/ops.rs +++ b/src/core/db/ops.rs @@ -27,8 +27,9 @@ impl ZlDatabase { .map_err(|e| ZlError::Config(format!("Failed to init FILE_OWNERS table: {}", e)))?; txn.open_table(LIB_INDEX) .map_err(|e| ZlError::Config(format!("Failed to init LIB_INDEX table: {}", e)))?; - txn.open_table(DEPENDENCIES) - .map_err(|e| ZlError::Config(format!("Failed to init DEPENDENCIES table: {}", e)))?; + txn.open_table(DEPENDENCIES).map_err(|e| { + ZlError::Config(format!("Failed to init DEPENDENCIES table: {}", e)) + })?; txn.open_table(PLUGIN_META) .map_err(|e| ZlError::Config(format!("Failed to init PLUGIN_META table: {}", e)))?; txn.open_table(PINNED) @@ -407,6 +408,7 @@ impl ZlDatabase { // ── Plugin metadata ── /// Store arbitrary plugin metadata (e.g. last sync timestamp) + #[allow(dead_code)] pub fn put_plugin_meta(&self, plugin_name: &str, data: &[u8]) -> ZlResult<()> { let txn = self .db diff --git a/src/core/elf/analysis.rs b/src/core/elf/analysis.rs index 5e22202..38cfeab 100644 --- a/src/core/elf/analysis.rs +++ b/src/core/elf/analysis.rs @@ -8,20 +8,25 @@ pub struct ElfInfo { /// Path to the ELF file pub path: PathBuf, /// Whether this is a dynamically linked executable, shared library, or static + #[allow(dead_code)] pub elf_type: ElfType, /// The PT_INTERP path (e.g., /lib64/ld-linux-x86-64.so.2) pub interpreter: Option, /// DT_NEEDED entries — shared libraries this binary needs pub needed_libs: Vec, /// Current RPATH (DT_RPATH) + #[allow(dead_code)] pub rpath: Option, /// Current RUNPATH (DT_RUNPATH) + #[allow(dead_code)] pub runpath: Option, /// SONAME if this is a shared library pub soname: Option, /// Architecture (e.g., EM_X86_64) + #[allow(dead_code)] pub machine: u16, /// Whether the binary is 64-bit + #[allow(dead_code)] pub is_64bit: bool, } @@ -47,7 +52,7 @@ pub fn analyze(path: &Path) -> ZlResult { let data = std::fs::read(path)?; let elf = Elf::parse(&data).map_err(|e| crate::error::ZlError::ElfAnalysis { path: path.to_path_buf(), - source: e, + source: Box::new(e), })?; let elf_type = match elf.header.e_type { @@ -115,3 +120,55 @@ pub fn scan_directory(dir: &Path) -> ZlResult> { } Ok(results) } + +#[cfg(test)] +mod tests { + use super::*; + + #[test] + fn test_is_elf_file_on_bin_sh() { + assert!(is_elf_file(Path::new("/bin/sh"))); + } + + #[test] + fn test_is_elf_file_on_non_elf() { + let dir = tempfile::tempdir().unwrap(); + let file = dir.path().join("notelf.txt"); + std::fs::write(&file, "hello world").unwrap(); + assert!(!is_elf_file(&file)); + } + + #[test] + fn test_is_elf_file_on_nonexistent() { + assert!(!is_elf_file(Path::new("/nonexistent/file"))); + } + + #[test] + fn test_analyze_bin_sh() { + let info = analyze(Path::new("/bin/sh")).unwrap(); + assert_eq!(info.path, PathBuf::from("/bin/sh")); + // /bin/sh is typically a PIE executable (ET_DYN with PT_INTERP) + assert!( + info.elf_type == ElfType::Executable, + "Expected executable, got: {:?}", + info.elf_type + ); + assert!( + info.interpreter.is_some(), + "/bin/sh should have a PT_INTERP" + ); + assert!( + !info.needed_libs.is_empty(), + "/bin/sh should have DT_NEEDED entries" + ); + } + + #[test] + fn test_scan_directory_finds_elfs() { + let results = scan_directory(Path::new("/bin")).unwrap(); + assert!( + !results.is_empty(), + "/bin should contain at least one ELF file" + ); + } +} diff --git a/src/core/elf/patcher.rs b/src/core/elf/patcher.rs index 10116a9..c955b9d 100644 --- a/src/core/elf/patcher.rs +++ b/src/core/elf/patcher.rs @@ -23,21 +23,20 @@ pub fn patch_for_zl( let mut patcher = elb::ElfPatcher::new(elf, file); // Patch interpreter if present and needs remapping - if let Some(ref orig_interp) = info.interpreter { - if let Some(new_interp) = mapping.remap_interpreter(orig_interp) { - let c_interp = std::ffi::CString::new(new_interp).map_err(|e| { - crate::error::ZlError::ElfPatch { - path: path.to_path_buf(), - message: e.to_string(), - } + if let Some(ref orig_interp) = info.interpreter + && let Some(new_interp) = mapping.remap_interpreter(orig_interp) + { + let c_interp = + std::ffi::CString::new(new_interp).map_err(|e| crate::error::ZlError::ElfPatch { + path: path.to_path_buf(), + message: e.to_string(), + })?; + patcher + .set_interpreter(&c_interp) + .map_err(|e| crate::error::ZlError::ElfPatch { + path: path.to_path_buf(), + message: e.to_string(), })?; - patcher - .set_interpreter(&c_interp) - .map_err(|e| crate::error::ZlError::ElfPatch { - path: path.to_path_buf(), - message: e.to_string(), - })?; - } } // Build and set RUNPATH so needed libs resolve correctly @@ -66,6 +65,7 @@ pub fn patch_for_zl( } /// Set only the interpreter of an ELF binary +#[allow(dead_code)] pub fn set_interpreter(path: &Path, new_interp: &str, page_size: u64) -> ZlResult<()> { use std::fs::OpenOptions; @@ -99,6 +99,7 @@ pub fn set_interpreter(path: &Path, new_interp: &str, page_size: u64) -> ZlResul } /// Set only the RUNPATH of an ELF binary +#[allow(dead_code)] pub fn set_runpath(path: &Path, new_runpath: &str, page_size: u64) -> ZlResult<()> { use elb::DynamicTag; use std::fs::OpenOptions; diff --git a/src/core/graph/model.rs b/src/core/graph/model.rs index 48df5a5..4f81eb8 100644 --- a/src/core/graph/model.rs +++ b/src/core/graph/model.rs @@ -19,28 +19,30 @@ pub struct PackageNode { pub explicit: bool, } +#[allow(dead_code)] #[derive(Debug, Clone, serde::Serialize, serde::Deserialize)] pub struct DependencyEdge { pub dep_type: DepType, pub version_constraint: Option, } +#[allow(dead_code)] #[derive(Debug, Clone, serde::Serialize, serde::Deserialize)] pub enum DepType { Declared, SharedLibrary { lib_name: String }, } +#[derive(Default)] +#[allow(dead_code)] pub struct DepGraph { pub graph: DiGraph, pub index: HashMap, } +#[allow(dead_code)] impl DepGraph { pub fn new() -> Self { - Self { - graph: DiGraph::new(), - index: HashMap::new(), - } + Self::default() } } diff --git a/src/core/graph/resolver.rs b/src/core/graph/resolver.rs index 3483909..2dae62a 100644 --- a/src/core/graph/resolver.rs +++ b/src/core/graph/resolver.rs @@ -6,6 +6,7 @@ use std::collections::HashSet; use super::model::{DepGraph, DepType, DependencyEdge, PackageNode}; use crate::error::{ZlError, ZlResult}; +#[allow(dead_code)] impl DepGraph { /// Add a package to the graph and return its node index pub fn add_package(&mut self, node: PackageNode) -> NodeIndex { @@ -71,10 +72,10 @@ impl DepGraph { pub fn topological_order(&self) -> ZlResult> { toposort(&self.graph, None).map_err(|cycle| { let node = &self.graph[cycle.node_id()]; - ZlError::DependencyResolution { - package: format!("{}-{}", node.id.name, node.id.version), - message: "Dependency cycle detected".into(), - } + ZlError::Config(format!( + "Dependency cycle detected involving {}-{}", + node.id.name, node.id.version + )) }) } @@ -190,7 +191,7 @@ mod tests { let mut graph = DepGraph::new(); let _a = graph.add_package(make_node("a", "1.0", true)); let b = graph.add_package(make_node("b", "1.0", false)); - let c = graph.add_package(make_node("c", "1.0", false)); + let _c = graph.add_package(make_node("c", "1.0", false)); graph.add_dependency(_a, b, None); // c is a dependency-type package but nothing depends on it => orphan diff --git a/src/core/graph/verifier.rs b/src/core/graph/verifier.rs index 51d3f9f..5769007 100644 --- a/src/core/graph/verifier.rs +++ b/src/core/graph/verifier.rs @@ -109,6 +109,7 @@ pub fn format_report(verification: &PackageVerification) -> String { } /// Verify and return an error if anything is broken +#[allow(dead_code)] pub fn verify_or_fail( package_dir: &Path, package_name: &str, diff --git a/src/core/path/mod.rs b/src/core/path/mod.rs index 7e5cb40..811bd8f 100644 --- a/src/core/path/mod.rs +++ b/src/core/path/mod.rs @@ -9,12 +9,15 @@ use crate::system::SystemProfile; #[derive(Debug, Clone)] pub struct PathMapping { /// ZL root directory + #[allow(dead_code)] pub zl_root: PathBuf, /// Package-specific install prefix + #[allow(dead_code)] pub pkg_prefix: PathBuf, /// Shared library directory pub shared_lib_dir: PathBuf, /// Shared binary directory + #[allow(dead_code)] pub shared_bin_dir: PathBuf, /// The system's actual ld-linux path pub system_interpreter: String, @@ -82,10 +85,7 @@ impl PathMapping { /// Compute the RUNPATH string for an ELF binary pub fn compute_runpath(&self, _binary_path: &Path, _needed_libs: &[String]) -> Option { - let mut paths = Vec::new(); - paths.push("$ORIGIN".to_string()); - paths.push(self.shared_lib_dir.to_string_lossy().to_string()); - Some(paths.join(":")) + Some(format!("$ORIGIN:{}", self.shared_lib_dir.to_string_lossy())) } /// Remap an arbitrary FHS path to its ZL equivalent @@ -101,3 +101,99 @@ impl PathMapping { original.to_string() } } + +#[cfg(test)] +mod tests { + use super::*; + + fn test_profile() -> SystemProfile { + SystemProfile::detect() + } + + #[test] + fn test_path_mapping_for_package() { + let profile = test_profile(); + let zl_root = Path::new("/tmp/test-zl"); + let mapping = PathMapping::for_package(zl_root, "firefox", "120.0", &profile); + + assert_eq!(mapping.zl_root, PathBuf::from("/tmp/test-zl")); + assert_eq!( + mapping.pkg_prefix, + PathBuf::from("/tmp/test-zl/packages/firefox-120.0") + ); + assert_eq!(mapping.shared_lib_dir, PathBuf::from("/tmp/test-zl/lib")); + assert_eq!(mapping.shared_bin_dir, PathBuf::from("/tmp/test-zl/bin")); + assert!(!mapping.prefix_map.is_empty()); + } + + #[test] + fn test_remap_path_usr_lib() { + let profile = test_profile(); + let zl_root = Path::new("/tmp/test-zl"); + let mapping = PathMapping::for_package(zl_root, "test", "1.0", &profile); + + let remapped = mapping.remap_path("/usr/lib/libfoo.so"); + assert!( + remapped.starts_with("/tmp/test-zl/lib"), + "Expected /usr/lib to remap to ZL lib dir, got: {}", + remapped + ); + } + + #[test] + fn test_remap_path_usr_bin() { + let profile = test_profile(); + let zl_root = Path::new("/tmp/test-zl"); + let mapping = PathMapping::for_package(zl_root, "test", "1.0", &profile); + + let remapped = mapping.remap_path("/usr/bin/myprog"); + assert!( + remapped.starts_with("/tmp/test-zl/bin"), + "Expected /usr/bin to remap to ZL bin dir, got: {}", + remapped + ); + } + + #[test] + fn test_remap_path_unknown_stays_unchanged() { + let profile = test_profile(); + let zl_root = Path::new("/tmp/test-zl"); + let mapping = PathMapping::for_package(zl_root, "test", "1.0", &profile); + + assert_eq!(mapping.remap_path("/opt/custom/path"), "/opt/custom/path"); + } + + #[test] + fn test_remap_interpreter_nonexistent() { + let profile = test_profile(); + let zl_root = Path::new("/tmp/test-zl"); + let mapping = PathMapping::for_package(zl_root, "test", "1.0", &profile); + + let result = mapping.remap_interpreter("/nonexistent/ld-linux.so.2"); + assert!(result.is_some()); + assert_eq!(result.unwrap(), profile.interpreter_str()); + } + + #[test] + fn test_remap_interpreter_existing() { + let profile = test_profile(); + let zl_root = Path::new("/tmp/test-zl"); + let mapping = PathMapping::for_package(zl_root, "test", "1.0", &profile); + + let result = mapping.remap_interpreter(&profile.interpreter_str()); + assert!(result.is_none()); + } + + #[test] + fn test_compute_runpath() { + let profile = test_profile(); + let zl_root = Path::new("/tmp/test-zl"); + let mapping = PathMapping::for_package(zl_root, "test", "1.0", &profile); + + let runpath = mapping + .compute_runpath(Path::new("/tmp/binary"), &[]) + .unwrap(); + assert!(runpath.contains("$ORIGIN")); + assert!(runpath.contains("/tmp/test-zl/lib")); + } +} diff --git a/src/core/path/remapper.rs b/src/core/path/remapper.rs index b805c33..73708e0 100644 --- a/src/core/path/remapper.rs +++ b/src/core/path/remapper.rs @@ -31,10 +31,12 @@ pub fn remap_shebang(path: &Path, mapping: &super::PathMapping) -> ZlResult ZlResult PathMapping { + let profile = SystemProfile::detect(); + PathMapping::for_package( + std::path::Path::new("/tmp/test-zl"), + "test", + "1.0", + &profile, + ) + } + + #[test] + fn test_remap_text_file_replaces_paths() { + let dir = tempfile::tempdir().unwrap(); + let file_path = dir.path().join("test.pc"); + std::fs::write(&file_path, "prefix=/usr/lib\nlibdir=/usr/lib/pkgconfig\n").unwrap(); + + let mapping = test_mapping(); + let changed = remap_text_file(&file_path, &mapping).unwrap(); + assert!(changed); + + let content = std::fs::read_to_string(&file_path).unwrap(); + assert!( + content.contains("/tmp/test-zl/lib"), + "Expected remapped path, got: {}", + content + ); + assert!(!content.contains("/usr/lib")); + } + + #[test] + fn test_remap_text_file_no_change() { + let dir = tempfile::tempdir().unwrap(); + let file_path = dir.path().join("plain.txt"); + std::fs::write(&file_path, "nothing to remap here\n").unwrap(); + + let mapping = test_mapping(); + let changed = remap_text_file(&file_path, &mapping).unwrap(); + assert!(!changed); + } + + #[test] + fn test_remap_shebang_replaces() { + let dir = tempfile::tempdir().unwrap(); + let file_path = dir.path().join("script.sh"); + std::fs::write(&file_path, "#!/usr/bin/env bash\necho hello\n").unwrap(); + + let mapping = test_mapping(); + let changed = remap_shebang(&file_path, &mapping).unwrap(); + assert!(changed); + + let content = std::fs::read_to_string(&file_path).unwrap(); + let first_line = content.lines().next().unwrap_or(""); + // /usr/bin should be remapped to ZL bin dir + assert!( + !first_line.contains("/usr/bin"), + "Expected /usr/bin to be remapped, got: {}", + first_line + ); + assert!(content.contains("echo hello")); + } + + #[test] + fn test_remap_shebang_no_shebang() { + let dir = tempfile::tempdir().unwrap(); + let file_path = dir.path().join("noshebang.txt"); + std::fs::write(&file_path, "just a file\n").unwrap(); + + let mapping = test_mapping(); + let changed = remap_shebang(&file_path, &mapping).unwrap(); + assert!(!changed); + } +} diff --git a/src/core/transaction.rs b/src/core/transaction.rs index ec89592..7668e02 100644 --- a/src/core/transaction.rs +++ b/src/core/transaction.rs @@ -11,6 +11,7 @@ use crate::core::db::ops::ZlDatabase; /// If a `Transaction` is dropped without calling `commit()`, the `Drop` impl /// will log a warning. It does **not** auto-rollback because it doesn't have a /// reference to the database — call `rollback()` explicitly when you have one. +#[derive(Default)] pub struct Transaction { created_files: Vec, created_dirs: Vec, @@ -22,13 +23,7 @@ pub struct Transaction { impl Transaction { /// Create a new, empty transaction. pub fn new() -> Self { - Self { - created_files: Vec::new(), - created_dirs: Vec::new(), - created_symlinks: Vec::new(), - db_package_keys: Vec::new(), - committed: false, - } + Self::default() } // ── Tracking methods ── @@ -91,19 +86,19 @@ impl Transaction { // 2. Remove symlinks in reverse for path in self.created_symlinks.iter().rev() { - if path.symlink_metadata().is_ok() { - if let Err(e) = std::fs::remove_file(path) { - error!("rollback: failed to remove symlink {}: {e}", path.display()); - } + if path.symlink_metadata().is_ok() + && let Err(e) = std::fs::remove_file(path) + { + error!("rollback: failed to remove symlink {}: {e}", path.display()); } } // 3. Remove files in reverse for path in self.created_files.iter().rev() { - if path.exists() { - if let Err(e) = std::fs::remove_file(path) { - error!("rollback: failed to remove file {}: {e}", path.display()); - } + if path.exists() + && let Err(e) = std::fs::remove_file(path) + { + error!("rollback: failed to remove file {}: {e}", path.display()); } } @@ -123,22 +118,27 @@ impl Transaction { // ── Accessors (useful for tests and diagnostics) ── + #[cfg(test)] pub fn created_files(&self) -> &[PathBuf] { &self.created_files } + #[cfg(test)] pub fn created_dirs(&self) -> &[PathBuf] { &self.created_dirs } + #[cfg(test)] pub fn created_symlinks(&self) -> &[PathBuf] { &self.created_symlinks } + #[cfg(test)] pub fn db_package_keys(&self) -> &[String] { &self.db_package_keys } + #[cfg(test)] pub fn is_committed(&self) -> bool { self.committed } diff --git a/src/core/verify.rs b/src/core/verify.rs index 8ebf335..471e33f 100644 --- a/src/core/verify.rs +++ b/src/core/verify.rs @@ -8,14 +8,17 @@ use crate::error::{ZlError, ZlResult}; #[derive(Debug)] pub struct VerifyResult { /// SHA256 checksum matched + #[allow(dead_code)] pub checksum_ok: bool, /// GPG signature verification result (None if no signature available) + #[allow(dead_code)] pub gpg_ok: Option, /// Human-readable verification message pub message: String, } impl VerifyResult { + #[allow(dead_code)] pub fn passed(&self) -> bool { self.checksum_ok && self.gpg_ok.unwrap_or(true) } diff --git a/src/error.rs b/src/error.rs index 4b0337c..c88716f 100644 --- a/src/error.rs +++ b/src/error.rs @@ -7,56 +7,23 @@ pub enum ZlError { #[error("ELF analysis failed for {path}: {source}")] ElfAnalysis { path: PathBuf, - source: goblin::error::Error, + source: Box, }, #[error("ELF patching failed for {path}: {message}")] ElfPatch { path: PathBuf, message: String }, - // ── Path remapping ── - #[error("Path remapping failed: {0}")] - PathRemap(String), - // ── Package resolution ── #[error("Package not found: {name}\n hint: try `zl search {name}` to find available packages")] PackageNotFound { name: String }, - #[error("Package already installed: {name}-{version}")] - AlreadyInstalled { name: String, version: String }, - - // ── Dependencies ── - #[error("Dependency resolution failed for {package}: {message}")] - DependencyResolution { package: String, message: String }, - - #[error("Unresolvable dependencies for {package}:\n{}", format_missing_deps(.missing))] - UnresolvableDeps { - package: String, - missing: Vec, - }, - - #[error("Dependency cycle detected: {}", .chain.join(" → "))] - DependencyCycle { chain: Vec }, - #[error("Conflict: {installed} conflicts with {requested}")] PackageConflict { installed: String, requested: String, }, - // ── Database ── - #[error("Database error: {source}")] - Database { - #[from] - source: redb::Error, - }, - // ── Network ── - #[error("Network error: {source}")] - Network { - #[from] - source: reqwest::Error, - }, - #[error("Download failed for {url} after {attempts} attempts: {message}")] DownloadFailed { url: String, @@ -96,10 +63,6 @@ pub enum ZlError { source: std::io::Error, }, - // ── Verification ── - #[error("Verification failed:\n{0}")] - Verification(String), - // ── Serialization ── #[error("Serialization error: {source}")] Serialization { @@ -113,27 +76,22 @@ pub enum ZlError { // ── GPG/Signature ── #[error("GPG signature verification failed for {path}: {message}")] - GpgVerification { - path: std::path::PathBuf, - message: String, - }, + GpgVerification { path: PathBuf, message: String }, // ── Self-update ── #[error("Self-update failed: {0}")] SelfUpdate(String), + // ── Verification ── + #[allow(dead_code)] + #[error("Verification failed:\n{0}")] + Verification(String), + // ── Environments ── #[error("Environment error: {0}")] Environment(String), } -fn format_missing_deps(deps: &[String]) -> String { - deps.iter() - .map(|d| format!(" - {}", d)) - .collect::>() - .join("\n") -} - impl ZlError { /// User-friendly suggestion for how to fix or work around this error. pub fn suggestion(&self) -> Option<&str> { @@ -141,9 +99,6 @@ impl ZlError { ZlError::PackageNotFound { .. } => { Some("Check the package name or try a different source with --from") } - ZlError::UnresolvableDeps { .. } => { - Some("Try installing the missing dependencies manually first") - } ZlError::DownloadFailed { .. } => { Some("Check your internet connection or try again later") } @@ -156,7 +111,6 @@ impl ZlError { ZlError::BuildToolMissing { .. } => { Some("Install the required build tool with your system package manager") } - ZlError::DependencyCycle { .. } => Some("This is a packaging bug — report it upstream"), ZlError::PackageConflict { .. } => { Some("Remove the conflicting package first with `zl remove`") } diff --git a/src/main.rs b/src/main.rs index 3a51068..9483dc2 100644 --- a/src/main.rs +++ b/src/main.rs @@ -31,10 +31,10 @@ fn main() { if let Err(e) = run(cli_args) { eprintln!("error: {}", e); - if let Some(zl_err) = e.downcast_ref::() { - if let Some(hint) = zl_err.suggestion() { - eprintln!(" hint: {}", hint); - } + if let Some(zl_err) = e.downcast_ref::() + && let Some(hint) = zl_err.suggestion() + { + eprintln!(" hint: {}", hint); } std::process::exit(1); } @@ -95,87 +95,34 @@ fn run(cli_args: cli::Cli) -> anyhow::Result<()> { github.init(&config.plugin_config("github"))?; registry.register(Box::new(github)); - let auto_yes = cli_args.global.yes || config.general.auto_confirm; - let dry_run = cli_args.global.dry_run; - let skip_verify = cli_args.global.skip_verify; + let ctx = cli::AppContext { + paths: &zl_paths, + db: &db, + registry: ®istry, + profile: &profile, + auto_yes: cli_args.global.yes || config.general.auto_confirm, + dry_run: cli_args.global.dry_run, + skip_verify: cli_args.global.skip_verify, + }; // Dispatch commands match cli_args.command { - cli::Commands::Install(args) => { - cli::install::handle( - args, - &zl_paths, - &db, - ®istry, - &profile, - auto_yes, - dry_run, - skip_verify, - )?; - } - cli::Commands::Remove(args) => { - cli::remove::handle(args, &zl_paths, &db, auto_yes, dry_run)?; - } - cli::Commands::Search(args) => { - cli::search::handle(args, ®istry)?; - } - cli::Commands::Update(args) => { - cli::update::handle( - args, - &zl_paths, - &db, - ®istry, - &profile, - auto_yes, - dry_run, - skip_verify, - )?; - } - cli::Commands::Upgrade(args) => { - cli::upgrade::handle( - args, - &zl_paths, - &db, - ®istry, - &profile, - auto_yes, - dry_run, - skip_verify, - )?; - } - cli::Commands::List(args) => { - cli::list::handle(args, &db)?; - } - cli::Commands::Info(args) => { - cli::info::handle(args, &db)?; - } - cli::Commands::Cache(cmd) => { - cli::cache::handle(cmd, &zl_paths)?; - } - cli::Commands::Completions(_) => { - unreachable!("handled above"); - } - cli::Commands::Pin(args) => { - cli::pin::handle_pin(args, &db)?; - } - cli::Commands::Unpin(args) => { - cli::pin::handle_unpin(args, &db)?; - } - cli::Commands::Export(args) => { - cli::lockfile::handle_export(args, &db)?; - } - cli::Commands::Import(args) => { - cli::lockfile::handle_import(args, &db)?; - } - cli::Commands::Switch(args) => { - cli::install::handle_switch(args, &zl_paths, &db)?; - } - cli::Commands::SelfUpdate => { - unreachable!("handled above"); - } - cli::Commands::Env(cmd) => { - cli::env::handle(cmd, &zl_paths, &config, &profile)?; - } + cli::Commands::Install(args) => cli::install::handle(args, &ctx)?, + cli::Commands::Remove(args) => cli::remove::handle(args, &ctx)?, + cli::Commands::Search(args) => cli::search::handle(args, ctx.registry)?, + cli::Commands::Update(args) => cli::update::handle(args, &ctx)?, + cli::Commands::Upgrade(args) => cli::upgrade::handle(args, &ctx)?, + cli::Commands::List(args) => cli::list::handle(args, ctx.db)?, + cli::Commands::Info(args) => cli::info::handle(args, ctx.db)?, + cli::Commands::Cache(cmd) => cli::cache::handle(cmd, ctx.paths)?, + cli::Commands::Completions(_) => unreachable!("handled above"), + cli::Commands::Pin(args) => cli::pin::handle_pin(args, ctx.db)?, + cli::Commands::Unpin(args) => cli::pin::handle_unpin(args, ctx.db)?, + cli::Commands::Export(args) => cli::lockfile::handle_export(args, ctx.db)?, + cli::Commands::Import(args) => cli::lockfile::handle_import(args, ctx.db)?, + cli::Commands::Switch(args) => cli::install::handle_switch(args, ctx.paths, ctx.db)?, + cli::Commands::SelfUpdate => unreachable!("handled above"), + cli::Commands::Env(cmd) => cli::env::handle(cmd, ctx.paths, &config, ctx.profile)?, } Ok(()) diff --git a/src/plugin/apt/deb.rs b/src/plugin/apt/deb.rs index 03a328e..c499f51 100644 --- a/src/plugin/apt/deb.rs +++ b/src/plugin/apt/deb.rs @@ -103,15 +103,18 @@ pub fn extract(deb_path: &Path, metadata: PackageCandidate) -> ZlResult(reader: R, dest: &Path) -> ZlResult<()> { let mut archive = tar::Archive::new(reader); archive.set_preserve_permissions(false); - archive.unpack(dest).map_err(|e| { - ZlError::Archive(format!("tar extraction failed: {}", e)) - }) + archive + .unpack(dest) + .map_err(|e| ZlError::Archive(format!("tar extraction failed: {}", e))) } fn is_script(path: &Path) -> bool { if let Some(ext) = path.extension() { let ext = ext.to_string_lossy(); - if matches!(ext.as_ref(), "sh" | "bash" | "py" | "pl" | "rb" | "lua" | "fish") { + if matches!( + ext.as_ref(), + "sh" | "bash" | "py" | "pl" | "rb" | "lua" | "fish" + ) { return true; } } @@ -125,7 +128,11 @@ fn is_script(path: &Path) -> bool { } /// Download a .deb file from a URL into dest_dir, verify checksum -pub fn download_deb(url: &str, expected_sha256: Option<&str>, dest_dir: &Path) -> ZlResult { +pub fn download_deb( + url: &str, + expected_sha256: Option<&str>, + dest_dir: &Path, +) -> ZlResult { let filename = url.rsplit('/').next().unwrap_or("package.deb"); let dest_path = dest_dir.join(filename); diff --git a/src/plugin/apt/index.rs b/src/plugin/apt/index.rs index f3a6a70..ac88dfe 100644 --- a/src/plugin/apt/index.rs +++ b/src/plugin/apt/index.rs @@ -39,10 +39,11 @@ pub fn parse(content: &str) -> Vec { if line.starts_with(' ') || line.starts_with('\t') { // Continuation line — append to current field (with newline separator) - if !current_key.is_empty() { - fields - .get_mut(current_key) - .map(|v| { v.push('\n'); v.push_str(line.trim_start()); }); + if !current_key.is_empty() + && let Some(v) = fields.get_mut(current_key) + { + v.push('\n'); + v.push_str(line.trim_start()); } } else if let Some((key, value)) = line.split_once(": ") { current_key = key; @@ -55,10 +56,10 @@ pub fn parse(content: &str) -> Vec { } // Handle final record if file doesn't end with blank line - if !fields.is_empty() { - if let Some(entry) = build_entry(&fields) { - entries.push(entry); - } + if !fields.is_empty() + && let Some(entry) = build_entry(&fields) + { + entries.push(entry); } entries @@ -148,10 +149,7 @@ Description: small, friendly text editor inspired by Pico assert_eq!(vim.depends, vec!["vim-common", "libacl1"]); assert_eq!(vim.conflicts, vec!["vim-tiny"]); assert_eq!(vim.provides, vec!["editor"]); - assert_eq!( - vim.description, - "Vi IMproved - enhanced vi editor" - ); + assert_eq!(vim.description, "Vi IMproved - enhanced vi editor"); let nano = &entries[1]; assert_eq!(nano.name, "nano"); diff --git a/src/plugin/apt/mod.rs b/src/plugin/apt/mod.rs index 7609eaf..b4ad222 100644 --- a/src/plugin/apt/mod.rs +++ b/src/plugin/apt/mod.rs @@ -45,8 +45,8 @@ pub struct AptPlugin { db: RwLock>, } -impl AptPlugin { - pub fn new() -> Self { +impl Default for AptPlugin { + fn default() -> Self { Self { mirror: DEFAULT_MIRROR.to_string(), suite: DEFAULT_SUITE.to_string(), @@ -56,6 +56,12 @@ impl AptPlugin { db: RwLock::new(Vec::new()), } } +} + +impl AptPlugin { + pub fn new() -> Self { + Self::default() + } fn entry_to_candidate(&self, entry: &AptEntry) -> PackageCandidate { let download_url = format!("{}/{}", self.mirror.trim_end_matches('/'), entry.filename); @@ -117,11 +123,11 @@ impl AptPlugin { message: format!("HTTP {}", resp.status()), }); } - Ok(resp.bytes().map_err(|e| ZlError::DownloadFailed { + resp.bytes().map_err(|e| ZlError::DownloadFailed { url: url.clone(), attempts: attempt, message: e.to_string(), - })?) + }) })?; std::fs::write(&cache_path, &bytes)?; @@ -142,10 +148,10 @@ impl AptPlugin { let mut all = Vec::new(); for component in &self.components { let path = self.packages_cache_path(component); - if let Ok(bytes) = std::fs::read(&path) { - if let Ok(content) = decompress_gz(&bytes) { - all.extend(index::parse(&content)); - } + if let Ok(bytes) = std::fs::read(&path) + && let Ok(content) = decompress_gz(&bytes) + { + all.extend(index::parse(&content)); } } all @@ -208,7 +214,9 @@ impl SourcePlugin for AptPlugin { let q = query.to_lowercase(); Ok(db .iter() - .filter(|e| e.name.to_lowercase().contains(&q) || e.description.to_lowercase().contains(&q)) + .filter(|e| { + e.name.to_lowercase().contains(&q) || e.description.to_lowercase().contains(&q) + }) .map(|e| self.entry_to_candidate(e)) .collect()) } @@ -217,9 +225,7 @@ impl SourcePlugin for AptPlugin { let db = self.db.read().unwrap(); Ok(db .iter() - .find(|e| { - e.name == name && version.map_or(true, |v| e.version == v) - }) + .find(|e| e.name == name && version.is_none_or(|v| e.version == v)) .map(|e| self.entry_to_candidate(e))) } diff --git a/src/plugin/aur/mod.rs b/src/plugin/aur/mod.rs index fe2cdff..f19d6c0 100644 --- a/src/plugin/aur/mod.rs +++ b/src/plugin/aur/mod.rs @@ -47,8 +47,8 @@ pub struct AurPlugin { client: reqwest::blocking::Client, } -impl AurPlugin { - pub fn new() -> Self { +impl Default for AurPlugin { + fn default() -> Self { Self { cache_dir: PathBuf::new(), client: reqwest::blocking::Client::builder() @@ -57,6 +57,12 @@ impl AurPlugin { .unwrap_or_default(), } } +} + +impl AurPlugin { + pub fn new() -> Self { + Self::default() + } fn to_candidate(pkg: &AurPackage) -> PackageCandidate { PackageCandidate { @@ -80,7 +86,11 @@ impl AurPlugin { .client .get(url) .timeout(std::time::Duration::from_secs(30)) - .send()?; + .send() + .map_err(|e| ZlError::Plugin { + plugin: "aur".into(), + message: format!("AUR RPC request failed: {}", e), + })?; if !resp.status().is_success() { return Err(ZlError::Plugin { @@ -131,7 +141,7 @@ impl SourcePlugin for AurPlugin { .map(Self::to_candidate); // If a version was requested, check it matches - Ok(candidate.filter(|c| version.map_or(true, |v| c.version == v))) + Ok(candidate.filter(|c| version.is_none_or(|v| c.version == v))) } fn download(&self, candidate: &PackageCandidate, dest_dir: &Path) -> ZlResult { diff --git a/src/plugin/github/mod.rs b/src/plugin/github/mod.rs index 21e7e00..ad9e5d0 100644 --- a/src/plugin/github/mod.rs +++ b/src/plugin/github/mod.rs @@ -56,8 +56,8 @@ pub struct GithubPlugin { client: reqwest::blocking::Client, } -impl GithubPlugin { - pub fn new() -> Self { +impl Default for GithubPlugin { + fn default() -> Self { Self { token: None, cache_dir: PathBuf::new(), @@ -67,15 +67,27 @@ impl GithubPlugin { .unwrap_or_default(), } } +} + +impl GithubPlugin { + pub fn new() -> Self { + Self::default() + } fn get(&self, url: &str) -> ZlResult { - let mut req = self.client.get(url).timeout(std::time::Duration::from_secs(30)); + let mut req = self + .client + .get(url) + .timeout(std::time::Duration::from_secs(30)); if let Some(ref token) = self.token { req = req.header("Authorization", format!("Bearer {}", token)); } - let resp = req.send()?; + let resp = req.send().map_err(|e| ZlError::Plugin { + plugin: "github".into(), + message: format!("GitHub API request failed: {}", e), + })?; if resp.status() == reqwest::StatusCode::NOT_FOUND { return Err(ZlError::PackageNotFound { @@ -105,7 +117,11 @@ impl GithubPlugin { }) } - fn release_to_candidate(&self, owner_repo: &str, release: &GhRelease) -> ZlResult { + fn release_to_candidate( + &self, + owner_repo: &str, + release: &GhRelease, + ) -> ZlResult { let asset = pick_best_asset(&release.assets).ok_or_else(|| ZlError::Plugin { plugin: "github".into(), message: format!( @@ -113,13 +129,22 @@ impl GithubPlugin { owner_repo, release.tag_name, std::env::consts::ARCH, - release.assets.iter().map(|a| a.name.as_str()).collect::>().join(", ") + release + .assets + .iter() + .map(|a| a.name.as_str()) + .collect::>() + .join(", ") ), })?; // Strip leading "v" from tag for the version field let version = release.tag_name.trim_start_matches('v').to_string(); - let name = owner_repo.split('/').last().unwrap_or(owner_repo).to_string(); + let name = owner_repo + .rsplit('/') + .next() + .unwrap_or(owner_repo) + .to_string(); Ok(PackageCandidate { name, @@ -180,10 +205,10 @@ impl SourcePlugin for GithubPlugin { for repo in &resp.items { // Try to get the latest release for each result let rel_url = format!("{}/repos/{}/releases/latest", GITHUB_API, repo.full_name); - if let Ok(release) = self.get::(&rel_url) { - if let Ok(candidate) = self.release_to_candidate(&repo.full_name, &release) { - candidates.push(candidate); - } + if let Ok(release) = self.get::(&rel_url) + && let Ok(candidate) = self.release_to_candidate(&repo.full_name, &release) + { + candidates.push(candidate); } } @@ -235,7 +260,8 @@ impl SourcePlugin for GithubPlugin { if attempt > 1 { tracing::info!("Retry {}/3 for {}", attempt, filename); } - let mut req = self.client + let mut req = self + .client .get(url) .timeout(std::time::Duration::from_secs(600)); @@ -257,11 +283,11 @@ impl SourcePlugin for GithubPlugin { }); } - Ok(resp.bytes().map_err(|e| ZlError::DownloadFailed { + resp.bytes().map_err(|e| ZlError::DownloadFailed { url: url.to_string(), attempts: attempt, message: e.to_string(), - })?) + }) })?; std::fs::write(&dest_path, &bytes)?; @@ -321,7 +347,9 @@ fn pick_best_asset(assets: &[GhAsset]) -> Option<&GhAsset> { // Must match current architecture (or be "any"/"all") let arch_match = arch_patterns.iter().any(|p| name_lower.contains(p)) || name_lower.contains("linux-unknown") - || (!name_lower.contains("x86") && !name_lower.contains("aarch") && !name_lower.contains("arm")); + || (!name_lower.contains("x86") + && !name_lower.contains("aarch") + && !name_lower.contains("arm")); if !arch_match { continue; @@ -342,9 +370,7 @@ fn pick_best_asset(assets: &[GhAsset]) -> Option<&GhAsset> { // Prefer compressed archives over bare binaries if name_lower.ends_with(".tar.gz") || name_lower.ends_with(".tgz") { score -= 4; - } else if name_lower.ends_with(".tar.xz") { - score -= 3; - } else if name_lower.ends_with(".tar.zst") { + } else if name_lower.ends_with(".tar.xz") || name_lower.ends_with(".tar.zst") { score -= 3; } else if name_lower.ends_with(".zip") { score -= 2; @@ -353,7 +379,7 @@ fn pick_best_asset(assets: &[GhAsset]) -> Option<&GhAsset> { } // Bare binary stays at 0 bonus - if best.map_or(true, |(_, best_score)| score < best_score) { + if best.is_none_or(|(_, best_score)| score < best_score) { best = Some((asset, score)); } } @@ -424,7 +450,8 @@ fn extract_zip(archive: &Path, dest: &Path) -> ZlResult<()> { .map_err(|e| ZlError::Archive(format!("zip open failed: {}", e)))?; for i in 0..zip.len() { - let mut entry = zip.by_index(i) + let mut entry = zip + .by_index(i) .map_err(|e| ZlError::Archive(format!("zip entry error: {}", e)))?; let outpath = dest.join(entry.name()); @@ -547,7 +574,10 @@ fn classify_extracted( fn is_script(path: &Path) -> bool { if let Some(ext) = path.extension() { let ext = ext.to_string_lossy(); - if matches!(ext.as_ref(), "sh" | "bash" | "py" | "pl" | "rb" | "lua" | "fish") { + if matches!( + ext.as_ref(), + "sh" | "bash" | "py" | "pl" | "rb" | "lua" | "fish" + ) { return true; } } diff --git a/src/plugin/mod.rs b/src/plugin/mod.rs index e6e9e1c..e491aee 100644 --- a/src/plugin/mod.rs +++ b/src/plugin/mod.rs @@ -25,6 +25,7 @@ pub struct PackageCandidate { pub struct ExtractedPackage { pub extract_dir: tempfile::TempDir, + #[allow(dead_code)] pub metadata: PackageCandidate, pub files: Vec, pub elf_files: Vec, @@ -42,15 +43,14 @@ pub trait SourcePlugin: Send + Sync { fn sync(&self) -> ZlResult<()>; } +#[derive(Default)] pub struct PluginRegistry { plugins: Vec>, } impl PluginRegistry { pub fn new() -> Self { - Self { - plugins: Vec::new(), - } + Self::default() } pub fn register(&mut self, plugin: Box) { diff --git a/src/plugin/pacman/mirror.rs b/src/plugin/pacman/mirror.rs index 33c12b4..ebb3719 100644 --- a/src/plugin/pacman/mirror.rs +++ b/src/plugin/pacman/mirror.rs @@ -4,6 +4,7 @@ use crate::error::ZlResult; #[derive(Debug, Clone)] pub struct Mirror { pub url: String, + #[allow(dead_code)] pub country: Option, } diff --git a/src/plugin/pacman/mod.rs b/src/plugin/pacman/mod.rs index d94085e..ceeeb35 100644 --- a/src/plugin/pacman/mod.rs +++ b/src/plugin/pacman/mod.rs @@ -25,8 +25,8 @@ pub struct PacmanPlugin { repos: Vec, } -impl PacmanPlugin { - pub fn new() -> Self { +impl Default for PacmanPlugin { + fn default() -> Self { Self { mirrors: Vec::new(), db_cache: RwLock::new(Vec::new()), @@ -35,6 +35,12 @@ impl PacmanPlugin { repos: DEFAULT_REPOS.iter().map(|s| s.to_string()).collect(), } } +} + +impl PacmanPlugin { + pub fn new() -> Self { + Self::default() + } fn primary_mirror(&self) -> ZlResult<&Mirror> { self.mirrors.first().ok_or_else(|| ZlError::Plugin { @@ -176,12 +182,12 @@ impl SourcePlugin for PacmanPlugin { } // Fall back to checking provides (virtual packages) - if version.is_none() { - if let Some((repo, entry)) = self.find_by_provides(name) { - let mirror = self.primary_mirror()?; - tracing::debug!("Resolved '{}' via provides from '{}'", name, entry.name); - return Ok(Some(database::entry_to_candidate(&entry, mirror, &repo))); - } + if version.is_none() + && let Some((repo, entry)) = self.find_by_provides(name) + { + let mirror = self.primary_mirror()?; + tracing::debug!("Resolved '{}' via provides from '{}'", name, entry.name); + return Ok(Some(database::entry_to_candidate(&entry, mirror, &repo))); } Ok(None) diff --git a/src/system/arch.rs b/src/system/arch.rs index 5121995..08eca7a 100644 --- a/src/system/arch.rs +++ b/src/system/arch.rs @@ -36,11 +36,11 @@ impl Arch { /// Uses `std::env::consts::ARCH` (compile-time) as primary, which is always correct /// for the running binary. Falls back to parsing uname-style strings. pub fn detect() -> Self { - Self::from_str(std::env::consts::ARCH) + Self::parse(std::env::consts::ARCH) } /// Parse an architecture string (as returned by `uname -m` or similar). - pub fn from_str(s: &str) -> Self { + pub fn parse(s: &str) -> Self { match s { "x86_64" | "amd64" => Arch::X86_64, "aarch64" | "arm64" => Arch::Aarch64, @@ -68,6 +68,7 @@ impl Arch { } /// The Pacman repo name for this architecture. + #[allow(dead_code)] pub fn pacman_name(&self) -> &str { match self { Arch::X86_64 => "x86_64", @@ -92,14 +93,14 @@ mod tests { #[test] fn test_arch_from_str() { - assert_eq!(Arch::from_str("x86_64"), Arch::X86_64); - assert_eq!(Arch::from_str("amd64"), Arch::X86_64); - assert_eq!(Arch::from_str("aarch64"), Arch::Aarch64); - assert_eq!(Arch::from_str("arm64"), Arch::Aarch64); - assert_eq!(Arch::from_str("armv7l"), Arch::Armv7); - assert_eq!(Arch::from_str("riscv64"), Arch::Riscv64); - assert_eq!(Arch::from_str("i686"), Arch::I686); - assert_eq!(Arch::from_str("s390x"), Arch::S390x); + assert_eq!(Arch::parse("x86_64"), Arch::X86_64); + assert_eq!(Arch::parse("amd64"), Arch::X86_64); + assert_eq!(Arch::parse("aarch64"), Arch::Aarch64); + assert_eq!(Arch::parse("arm64"), Arch::Aarch64); + assert_eq!(Arch::parse("armv7l"), Arch::Armv7); + assert_eq!(Arch::parse("riscv64"), Arch::Riscv64); + assert_eq!(Arch::parse("i686"), Arch::I686); + assert_eq!(Arch::parse("s390x"), Arch::S390x); } #[test] diff --git a/src/system/detect.rs b/src/system/detect.rs index f97d2c3..726ba09 100644 --- a/src/system/detect.rs +++ b/src/system/detect.rs @@ -5,7 +5,7 @@ use std::path::Path; #[derive(Debug, Clone, PartialEq, Eq, serde::Serialize, serde::Deserialize)] pub enum SystemLayout { /// Standard FHS with separate /bin, /lib, /usr/bin, /usr/lib - FHS, + Fhs, /// Merged /usr: /bin → /usr/bin, /lib → /usr/lib (most modern distros) MergedUsr, /// NixOS: everything in /nix/store, declarative system @@ -23,7 +23,7 @@ pub enum SystemLayout { impl fmt::Display for SystemLayout { fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result { let s = match self { - SystemLayout::FHS => "FHS", + SystemLayout::Fhs => "FHS", SystemLayout::MergedUsr => "Merged /usr", SystemLayout::NixOS => "NixOS", SystemLayout::Guix => "GNU Guix", @@ -36,9 +36,9 @@ impl fmt::Display for SystemLayout { } impl SystemLayout { - pub fn from_str(s: &str) -> Self { + pub fn parse(s: &str) -> Self { match s.to_lowercase().as_str() { - "fhs" => SystemLayout::FHS, + "fhs" => SystemLayout::Fhs, "merged" | "merged_usr" | "mergedusr" => SystemLayout::MergedUsr, "nixos" | "nix" => SystemLayout::NixOS, "guix" => SystemLayout::Guix, @@ -82,7 +82,7 @@ pub fn detect_layout() -> SystemLayout { } // Default: standard FHS - SystemLayout::FHS + SystemLayout::Fhs } /// Check if `link` is a symlink pointing to `target` (directly or via canonical path). @@ -146,9 +146,9 @@ mod tests { #[test] fn test_layout_from_str() { - assert_eq!(SystemLayout::from_str("fhs"), SystemLayout::FHS); - assert_eq!(SystemLayout::from_str("nixos"), SystemLayout::NixOS); - assert_eq!(SystemLayout::from_str("merged"), SystemLayout::MergedUsr); - assert_eq!(SystemLayout::from_str("termux"), SystemLayout::Termux); + assert_eq!(SystemLayout::parse("fhs"), SystemLayout::Fhs); + assert_eq!(SystemLayout::parse("nixos"), SystemLayout::NixOS); + assert_eq!(SystemLayout::parse("merged"), SystemLayout::MergedUsr); + assert_eq!(SystemLayout::parse("termux"), SystemLayout::Termux); } } diff --git a/src/system/mod.rs b/src/system/mod.rs index 7f951af..2e59f1a 100644 --- a/src/system/mod.rs +++ b/src/system/mod.rs @@ -22,6 +22,7 @@ pub struct SystemProfile { /// CPU architecture pub arch: Arch, /// Whether the system is 64-bit + #[allow(dead_code)] pub is_64bit: bool, /// Kernel page size (for ELF patching) pub page_size: u64, @@ -93,7 +94,7 @@ impl SystemProfile { } if let Some(ref layout_str) = config.layout { - self.layout = SystemLayout::from_str(layout_str); + self.layout = SystemLayout::parse(layout_str); tracing::info!("Layout overridden to: {}", self.layout); } @@ -112,6 +113,7 @@ impl SystemProfile { } /// Find the full path of a system library, or None. + #[allow(dead_code)] pub fn find_system_lib(&self, lib_name: &str) -> Option { for dir in &self.lib_dirs { let path = dir.join(lib_name); diff --git a/src/system/paths.rs b/src/system/paths.rs index 8a6df50..2305e13 100644 --- a/src/system/paths.rs +++ b/src/system/paths.rs @@ -116,12 +116,11 @@ pub fn detect_multiarch_tuple() -> Option { if let Ok(output) = std::process::Command::new("dpkg-architecture") .arg("-qDEB_HOST_MULTIARCH") .output() + && output.status.success() { - if output.status.success() { - let tuple = String::from_utf8_lossy(&output.stdout).trim().to_string(); - if !tuple.is_empty() && Path::new(&format!("/usr/lib/{}", tuple)).is_dir() { - return Some(tuple); - } + let tuple = String::from_utf8_lossy(&output.stdout).trim().to_string(); + if !tuple.is_empty() && Path::new(&format!("/usr/lib/{}", tuple)).is_dir() { + return Some(tuple); } } @@ -314,7 +313,7 @@ mod tests { #[test] fn test_discover_lib_dirs_not_empty() { - let dirs = discover_lib_dirs(&SystemLayout::FHS); + let dirs = discover_lib_dirs(&SystemLayout::Fhs); assert!( !dirs.is_empty(), "Should find at least some lib directories" From 19fad3dbf6e3346d0e9fa3c5a9e82c90f0892abf Mon Sep 17 00:00:00 2001 From: Claude Date: Mon, 23 Feb 2026 09:02:14 +0000 Subject: [PATCH 3/5] docs: add naming conventions section to CLAUDE.md https://claude.ai/code/session_01W29EWLxxuZuRnfP9pJjxuB --- CLAUDE.md | 9 +++++++++ 1 file changed, 9 insertions(+) diff --git a/CLAUDE.md b/CLAUDE.md index 30ebe27..434aef9 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -130,3 +130,12 @@ Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` fun - **Zero clippy warnings**: `cargo clippy -- -D warnings` passes clean - **Zero `cargo fmt` diff**: all code is formatted - **186 tests**: comprehensive coverage of core modules (conflicts, ELF, path mapping, DB, graph, transaction, verify, plugins, system detection) + +### Naming conventions + +- `SystemLayout` variants use PascalCase: `Fhs`, `MergedUsr`, `NixOS`, `Guix`, `Termux`, `GoboLinux`, `Custom` +- `Conflict` variants avoid repeating the enum name: `Declared` (not `DeclaredConflict`), `Version` (not `VersionConflict`) +- `Arch::parse()` and `SystemLayout::parse()` instead of `from_str()` (avoids confusion with `std::str::FromStr` trait) +- Structs with simple `new()` constructors also implement `Default` (via `#[derive(Default)]` or manual impl): `PluginRegistry`, `PacmanPlugin`, `AptPlugin`, `AurPlugin`, `GithubPlugin`, `DepGraph`, `Transaction` +- The `core/build/` module uses `#![allow(dead_code)]` since it is scaffolding for future source-build support +- `DepGraph`, `DependencyEdge`, `DepType` have `#[allow(dead_code)]` — they are part of the graph model used for future features From f17e606c58b951c07cd47623ed4201bddf347c6a Mon Sep 17 00:00:00 2001 From: Claude Date: Mon, 23 Feb 2026 10:10:57 +0000 Subject: [PATCH 4/5] feat: smart search, universal source picker, AUR binary variants, safe cascade, better errors Major improvements to search, install, remove, and error handling: Search (`zl search`): - Parallel queries across all plugins via thread::scope - Relevance scoring: exact(100), starts-with(80), contains(60), desc(30) - New flags: --sort (relevance|name|version), --exact - Result count per source, truncated descriptions, tag annotations Install (`zl install`): - Universal source picker queries ALL plugins in parallel - AUR -bin/-appimage/-prebuilt variants auto-discovered and offered - Clear labeling: [binary], [source], [release] tags in picker - Architecture compatibility check on ELF binaries before patching - Post-install missing shared library warnings AUR plugin: - Binary variant discovery: searches for -bin/-appimage/-prebuilt - [binary] tag in descriptions for precompiled packages - Better build error messages with specific hints (base-devel, PGP) - Captures stderr from git/makepkg for actionable error output Remove (`zl remove --cascade`): - Preview what cascade will/won't remove before prompting - Shows "Keeping X (needed by Y)" for shared dependencies - Dry-run support shows full removal plan - Only removes ZL-managed implicit dependencies, never system pkgs Error handling: - Plugin-specific suggestions: pacman mirrors, APT repos, GitHub rate limits, AUR build deps, arch mismatches, self-update perms - New ArchMismatch error variant - elf_machine_to_arch() + check_arch_compat() in ELF analysis 195 tests pass (103 bin + 92 lib), zero clippy warnings, zero fmt diff. https://claude.ai/code/session_01W29EWLxxuZuRnfP9pJjxuB --- CLAUDE.md | 40 +++-- src/cli/install.rs | 245 +++++++++++++++++++++++------- src/cli/mod.rs | 16 +- src/cli/remove.rs | 228 ++++++++++++++++++++++++++-- src/cli/search.rs | 317 ++++++++++++++++++++++++++++++++++----- src/core/elf/analysis.rs | 48 ++++++ src/error.rs | 72 ++++++++- src/plugin/aur/mod.rs | 131 ++++++++++++++-- 8 files changed, 972 insertions(+), 125 deletions(-) diff --git a/CLAUDE.md b/CLAUDE.md index 434aef9..1be8768 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -21,7 +21,7 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co cargo build # Debug build cargo build --release # Release build cargo run -- # Run (e.g., cargo run -- install firefox) -cargo test # Run all tests (186 tests: 90 bin + 96 lib) +cargo test # Run all tests (195 tests: 103 bin + 92 lib) cargo test # Run a single test by name cargo test -- --nocapture # Run tests with stdout visible cargo clippy # Lint @@ -38,16 +38,33 @@ There are no integration tests — all tests are unit tests inside `#[cfg(test)] ### Install flow (the core operation) 1. **Detect system** — `SystemProfile::detect()` auto-detects arch, dynamic linker, libc, lib dirs, filesystem layout -2. **Resolve source** — if `--from` omitted, queries ALL plugins in parallel; user picks if multiple hits (`pick_source()` in `cli/install.rs`) +2. **Resolve source** — if `--from` omitted, queries ALL plugins in parallel (including AUR `-bin` variants); user picks from full list via interactive `dialoguer::Select` (`pick_source()` in `cli/install.rs`) 3. **Resolve deps** — recursive dependency resolution with cycle detection (`cli/deps.rs`) 4. **Check conflicts** — 5 types: file ownership, binary name, library soname, declared conflicts, version constraints (`core/conflicts.rs`) 5. **Download** in parallel (4 threads via `thread::scope`) with progress bars and retry 6. **Verify** — SHA256 checksum + GPG signature (best-effort) 7. **Extract** and analyze ELF binaries with `goblin` -8. **Patch** ELF binaries with `elb` (set interpreter, RUNPATH) using detected page size -9. **Remap** FHS paths to ZL-managed directories (`core/path/`) -10. **Install** atomically — `Transaction` tracks all changes; rollback on any failure -11. **Track** in redb database + dependency graph +8. **Arch check** — verify ELF `e_machine` matches host architecture before patching (warning on mismatch) +9. **Patch** ELF binaries with `elb` (set interpreter, RUNPATH) using detected page size +10. **Remap** FHS paths to ZL-managed directories (`core/path/`) +11. **Install** atomically — `Transaction` tracks all changes; rollback on any failure +12. **Post-install checks** — warn about missing shared libraries not found in ZL DB or system lib dirs +13. **Track** in redb database + dependency graph + +### Search flow (`zl search`) + +1. **Parallel queries** — all plugins are queried via `thread::scope` simultaneously (not sequentially) +2. **Relevance scoring** — each result is scored: exact name match (100), starts-with (80), contains in name (60), description-only (30) +3. **AUR binary discovery** — if the query doesn't end with `-bin`, automatically also fetches `-bin`, `-appimage`, `-prebuilt` variants and tags them `[binary]` +4. **Sorted output** — results sorted by relevance (default), name, or version via `--sort` +5. **Filtering** — `--exact` shows only exact name matches; `--from` limits to a single source; `--limit` controls results per source + +### Removal flow (`zl remove --cascade`) + +1. **Preview before action** — `--cascade` always shows what will and won't be removed before prompting +2. **Orphan detection** — only removes packages that are: (a) tracked in ZL's DB, (b) marked as implicit (`explicit: false`), (c) not depended on by any remaining package (checked via both dependency table and shared lib needs) +3. **Shared dep protection** — dependencies used by other packages are listed as "Keeping (needed by X)" and never removed +4. **Dry-run support** — `--dry-run` with `--cascade` shows the full removal plan without touching anything ### Startup flow (`main.rs`) @@ -77,7 +94,7 @@ All plugins implement `SourcePlugin` and are registered in `main.rs`. To add a n 2. Add `pub mod ;` in `src/plugin/mod.rs` 3. Instantiate and register in `main.rs`'s `run()` function -Current plugins: `pacman` (Arch repos), `aur` (AUR RPC v5 + makepkg), `apt` (Packages.gz + .deb), `github` (Releases API). +Current plugins: `pacman` (Arch repos), `aur` (AUR RPC v5 + makepkg, with `-bin` variant discovery), `apt` (Packages.gz + .deb), `github` (Releases API). ### Command dispatch pattern @@ -86,6 +103,7 @@ Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` fun ### Error handling - `ZlError` enum in `error.rs` (thiserror, boxed where needed to keep size small) for domain errors with `.suggestion()` hints +- Plugin-specific error suggestions: pacman mirror issues, APT repo failures, GitHub rate limits, AUR build failures (base-devel, PGP keys), architecture mismatches, self-update permissions - `anyhow::Result` at the top level (`run()` returns `anyhow::Result<()>`) - `retry_with_backoff()` in `error.rs` for HTTP retries (3 attempts: 1s, 2s, 4s) - Tracing: default level `warn`; `-v` = info, `-vv` = debug @@ -115,11 +133,11 @@ Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` fun | Crate | Purpose | |-------|---------| -| `goblin` | Read ELF metadata (interpreter, needed libs, rpath, soname) | +| `goblin` | Read ELF metadata (interpreter, needed libs, rpath, soname, machine type) | | `elb` | Patch ELF binaries (set interpreter, set runpath) | | `petgraph` | Dependency graph with topological sort, cycle detection | | `redb` | Embedded key-value database (pure Rust, ACID) | -| `clap` (derive) | CLI argument parsing | +| `clap` (derive) | CLI argument parsing (with `ValueEnum` for `SortOrder`) | | `reqwest` (blocking+json) | HTTP client | | `tar` + `zstd` + `flate2` + `xz2` + `bzip2` + `ar` + `zip` | Archive formats | | `sha2` | SHA256 checksums | @@ -129,13 +147,15 @@ Each CLI command lives in `src/cli/.rs` with a `pub fn handle(...)` fun - **Zero clippy warnings**: `cargo clippy -- -D warnings` passes clean - **Zero `cargo fmt` diff**: all code is formatted -- **186 tests**: comprehensive coverage of core modules (conflicts, ELF, path mapping, DB, graph, transaction, verify, plugins, system detection) +- **195 tests**: comprehensive coverage of core modules (conflicts, ELF, path mapping, DB, graph, transaction, verify, plugins, search scoring, system detection) ### Naming conventions - `SystemLayout` variants use PascalCase: `Fhs`, `MergedUsr`, `NixOS`, `Guix`, `Termux`, `GoboLinux`, `Custom` - `Conflict` variants avoid repeating the enum name: `Declared` (not `DeclaredConflict`), `Version` (not `VersionConflict`) - `Arch::parse()` and `SystemLayout::parse()` instead of `from_str()` (avoids confusion with `std::str::FromStr` trait) +- `SortOrder` enum (in `cli/mod.rs`) uses `ValueEnum` derive for clap: `Relevance`, `Name`, `Version` - Structs with simple `new()` constructors also implement `Default` (via `#[derive(Default)]` or manual impl): `PluginRegistry`, `PacmanPlugin`, `AptPlugin`, `AurPlugin`, `GithubPlugin`, `DepGraph`, `Transaction` - The `core/build/` module uses `#![allow(dead_code)]` since it is scaffolding for future source-build support - `DepGraph`, `DependencyEdge`, `DepType` have `#[allow(dead_code)]` — they are part of the graph model used for future features +- `ArchMismatch` error variant has `#[allow(dead_code)]` — available for strict arch enforcement in future diff --git a/src/cli/install.rs b/src/cli/install.rs index a16421c..29c3dde 100644 --- a/src/cli/install.rs +++ b/src/cli/install.rs @@ -401,6 +401,18 @@ pub fn install_from_archive( let mapping = PathMapping::for_package(&paths.root, &candidate.name, &candidate.version, profile); + // Check architecture compatibility of ELF binaries before patching + for elf_path in &extracted.elf_files { + if let Ok(info) = analysis::analyze(elf_path) { + if let Err(msg) = analysis::check_arch_compat(&info, &profile.arch) { + eprintln!(" Warning: {}", msg); + eprintln!(" This package may not work on your system."); + } + // Only check the first ELF to avoid spam + break; + } + } + // Patch ELF binaries for elf_path in &extracted.elf_files { match analysis::analyze(elf_path) { @@ -526,7 +538,7 @@ pub fn install_from_archive( }, installed_files: installed_files.clone(), provides_libs: provides_libs.clone(), - needs_libs, + needs_libs: needs_libs.clone(), installed_at: now, explicit, }; @@ -563,6 +575,35 @@ pub fn install_from_archive( eprintln!(" Warning: {}", report); } + // Post-install: check for missing shared libraries + let mut missing_libs = Vec::new(); + for lib in &needs_libs { + // Check if provided by another ZL package + if db.lib_provider(lib)?.is_some() { + continue; + } + // Check if found on the host system + if profile.system_lib_exists(lib) { + continue; + } + // Check if provided by this package itself + if lib_index_paths.contains_key(lib) { + continue; + } + missing_libs.push(lib.as_str()); + } + if !missing_libs.is_empty() { + eprintln!( + " Warning: {} missing shared lib(s) for {}:", + missing_libs.len(), + candidate.name + ); + for lib in &missing_libs { + eprintln!(" - {}", lib); + } + eprintln!(" hint: install the providing package or check your system libraries"); + } + tracing::info!( "Installed {}-{} ({} files)", candidate.name, @@ -831,8 +872,22 @@ fn patch_desktop_exec(content: &str, _bin_dir: &Path) -> String { .join("\n") } +/// Represents a single install option found across all sources. +struct InstallOption { + /// Plugin name (e.g. "pacman", "aur", "github") + plugin_name: String, + /// The resolved package candidate + candidate: PackageCandidate, + /// Human-readable label for the interactive picker + label: String, +} + /// When `--from` is not specified, resolve from all plugins and pick a source. /// +/// Queries ALL sources in parallel (including AUR `-bin` variants) and presents +/// a comprehensive list so the user can choose between binary vs source builds, +/// different repos, etc. +/// /// - 0 results → PackageNotFound error /// - 1 result → auto-select (no prompt) /// - N results + auto_yes → pick the first (highest-priority plugin) @@ -845,74 +900,127 @@ fn pick_source( ) -> ZlResult { println!("Searching all sources for '{}'...", package); - let mut found: Vec<(String, String, String)> = Vec::new(); // (plugin_name, display_label, version) + let plugins = registry.all(); + let found: std::sync::Mutex> = std::sync::Mutex::new(Vec::new()); - for plugin in registry.all() { - // Sync first so the local DB is up to date before querying - if let Err(e) = plugin.sync() { - tracing::debug!( - "Failed to sync {} during source discovery: {}", - plugin.name(), - e - ); - } + // Query all plugins in parallel + std::thread::scope(|scope| { + let found = &found; + let mut handles = Vec::new(); + + for plugin in &plugins { + handles.push(scope.spawn(move || { + // Sync first so the local DB is up to date + if let Err(e) = plugin.sync() { + tracing::debug!( + "Failed to sync {} during source discovery: {}", + plugin.name(), + e + ); + } - match plugin.resolve(package, version) { - Ok(Some(candidate)) => { - let label = format!( - "{} {} [{}]{}", - candidate.name, - candidate.version, - candidate.source, - if candidate.description.is_empty() { - String::new() - } else { - format!( - " — {}", - candidate.description.chars().take(60).collect::() - ) + // Resolve exact name + match plugin.resolve(package, version) { + Ok(Some(candidate)) => { + let label = format_option_label(&candidate); + let mut f = found.lock().unwrap(); + f.push(InstallOption { + plugin_name: plugin.name().to_string(), + candidate, + label, + }); } - ); - found.push((plugin.name().to_string(), label, candidate.version)); - } - Ok(None) => {} - Err(e) => { - tracing::debug!( - "Plugin '{}' could not resolve '{}': {}", - plugin.name(), - package, - e - ); + Ok(None) => {} + Err(e) => { + tracing::debug!( + "Plugin '{}' could not resolve '{}': {}", + plugin.name(), + package, + e + ); + } + } + + // For AUR: also check -bin, -appimage, -prebuilt variants + if plugin.name() == "aur" { + for suffix in &["-bin", "-appimage", "-prebuilt"] { + let variant_name = format!("{}{}", package, suffix); + match plugin.resolve(&variant_name, None) { + Ok(Some(candidate)) => { + let label = format_option_label(&candidate); + let mut f = found.lock().unwrap(); + f.push(InstallOption { + plugin_name: plugin.name().to_string(), + candidate, + label, + }); + } + Ok(None) => {} + Err(_) => {} + } + } + } + })); + } + + for handle in handles { + if handle.join().is_err() { + tracing::warn!("A source discovery thread panicked"); } } - } + }); + + let mut found = found.into_inner().unwrap_or_default(); + + // Sort: exact name matches first, then by plugin priority order + let plugin_order: std::collections::HashMap<&str, usize> = plugins + .iter() + .enumerate() + .map(|(i, p)| (p.name(), i)) + .collect(); + + found.sort_by(|a, b| { + let a_exact = a.candidate.name == package; + let b_exact = b.candidate.name == package; + // Exact name matches first + b_exact + .cmp(&a_exact) + // Then by plugin priority + .then_with(|| { + let a_prio = plugin_order.get(a.plugin_name.as_str()).unwrap_or(&99); + let b_prio = plugin_order.get(b.plugin_name.as_str()).unwrap_or(&99); + a_prio.cmp(b_prio) + }) + }); match found.len() { 0 => Err(ZlError::PackageNotFound { name: package.to_string(), }), 1 => { - let (source, label, _) = &found[0]; - println!("Found: {}", label); - Ok(source.clone()) + let opt = &found[0]; + println!("Found: {}", opt.label); + Ok(opt.plugin_name.clone()) } _ if auto_yes => { - // Non-interactive: pick first (highest-priority plugin) - let (source, label, _) = &found[0]; - println!("Auto-selected: {}", label); - Ok(source.clone()) + // Non-interactive: pick first (highest-priority, exact-name match) + let opt = &found[0]; + println!("Auto-selected: {}", opt.label); + Ok(opt.plugin_name.clone()) } _ => { - let items: Vec<&str> = found.iter().map(|(_, label, _)| label.as_str()).collect(); + // Show all options with clear labeling + println!(); + println!(" Found '{}' in {} options:", package, found.len()); + println!(); + + let items: Vec = found.iter().map(|o| o.label.clone()).collect(); + let item_refs: Vec<&str> = items.iter().map(|s| s.as_str()).collect(); let selection = dialoguer::Select::with_theme(&dialoguer::theme::ColorfulTheme::default()) - .with_prompt(format!( - "Found '{}' in {} sources. Select one", - package, - found.len() - )) - .items(&items) + .with_prompt("Select package to install") + .items(&item_refs) .default(0) .interact() .map_err(|e| ZlError::Plugin { @@ -920,11 +1028,44 @@ fn pick_source( message: format!("Selection cancelled: {}", e), })?; - Ok(found[selection].0.clone()) + Ok(found[selection].plugin_name.clone()) } } } +/// Format a human-readable label for an install option. +fn format_option_label(candidate: &PackageCandidate) -> String { + let type_tag = if candidate.name.ends_with("-bin") + || candidate.name.ends_with("-appimage") + || candidate.name.ends_with("-prebuilt") + { + " [binary]" + } else if candidate.source == "aur" { + " [source]" + } else if candidate.source == "github" { + " [release]" + } else { + "" + }; + + let desc = if candidate.description.is_empty() { + String::new() + } else { + let clean: String = candidate + .description + .trim_start_matches("[binary] ") + .chars() + .take(50) + .collect(); + format!(" - {}", clean) + }; + + format!( + "{} {} [{}]{}{}", + candidate.name, candidate.version, candidate.source, type_tag, desc + ) +} + fn is_executable(path: &Path) -> bool { use std::os::unix::fs::PermissionsExt; std::fs::metadata(path) diff --git a/src/cli/mod.rs b/src/cli/mod.rs index bd65bf2..4e9370c 100644 --- a/src/cli/mod.rs +++ b/src/cli/mod.rs @@ -13,7 +13,7 @@ pub mod selfupdate; pub mod update; pub mod upgrade; -use clap::{ArgAction, Args, Parser, Subcommand}; +use clap::{ArgAction, Args, Parser, Subcommand, ValueEnum}; use clap_complete::Shell; use crate::core::db::ops::ZlDatabase; @@ -21,6 +21,14 @@ use crate::paths::ZlPaths; use crate::plugin::PluginRegistry; use crate::system::SystemProfile; +#[derive(Debug, Clone, Copy, Default, ValueEnum)] +pub enum SortOrder { + #[default] + Relevance, + Name, + Version, +} + /// Shared application context passed to all command handlers. pub struct AppContext<'a> { pub paths: &'a ZlPaths, @@ -157,6 +165,12 @@ pub struct SearchArgs { /// Maximum results per source (default: 20) #[arg(long)] pub limit: Option, + /// Sort results: relevance (default), name, version + #[arg(long, default_value = "relevance")] + pub sort: SortOrder, + /// Only show exact name matches + #[arg(long)] + pub exact: bool, } #[derive(Args)] diff --git a/src/cli/remove.rs b/src/cli/remove.rs index 5ad11ab..00f7707 100644 --- a/src/cli/remove.rs +++ b/src/cli/remove.rs @@ -37,6 +37,12 @@ pub fn handle(args: RemoveArgs, ctx: &AppContext) -> ZlResult<()> { println!(" - {}", v.id.version); } println!("\nRemoving all versions..."); + + if args.cascade { + // Preview cascade before any removal + preview_cascade_plan(&args.package, db)?; + } + if dry_run { println!( "[DRY-RUN] Would remove {} version(s) of {}. No changes made.", @@ -49,14 +55,14 @@ pub fn handle(args: RemoveArgs, ctx: &AppContext) -> ZlResult<()> { remove_single(v, paths, db)?; } if args.cascade { - remove_orphans(paths, db)?; + remove_orphans(paths, db, dry_run)?; } return Ok(()); } let pkg_key = format!("{}-{}", node.id.name, node.id.version); - // 2. Confirm + // 2. Show what will happen println!( "Package: {}-{} ({} files)", node.id.name, @@ -64,6 +70,10 @@ pub fn handle(args: RemoveArgs, ctx: &AppContext) -> ZlResult<()> { node.installed_files.len() ); + if args.cascade { + preview_cascade_plan(&args.package, db)?; + } + if dry_run { println!( "[DRY-RUN] Would remove {}-{}. No changes made.", @@ -73,7 +83,12 @@ pub fn handle(args: RemoveArgs, ctx: &AppContext) -> ZlResult<()> { } if !auto_yes { - print!("Remove this package? [Y/n] "); + let prompt = if args.cascade { + "Remove this package and its orphaned dependencies? [Y/n] " + } else { + "Remove this package? [Y/n] " + }; + print!("{}", prompt); use std::io::Write; std::io::stdout().flush()?; let mut input = String::new(); @@ -114,7 +129,7 @@ pub fn handle(args: RemoveArgs, ctx: &AppContext) -> ZlResult<()> { // 7. Cascade: remove orphans if requested if args.cascade { - remove_orphans(paths, db)?; + remove_orphans(paths, db, dry_run)?; } Ok(()) @@ -141,6 +156,10 @@ fn remove_specific_version( node.installed_files.len() ); + if cascade { + preview_cascade_plan(name, db)?; + } + if dry_run { println!( "[DRY-RUN] Would remove {}-{}. No changes made.", @@ -181,7 +200,7 @@ fn remove_specific_version( } if cascade { - remove_orphans(paths, db)?; + remove_orphans(paths, db, dry_run)?; } Ok(()) @@ -251,18 +270,181 @@ fn remove_bin_symlinks( Ok(()) } -/// Find and remove orphaned packages (dependencies not needed by any remaining package) -fn remove_orphans(paths: &ZlPaths, db: &ZlDatabase) -> ZlResult<()> { +/// Preview what `--cascade` would remove BEFORE actually doing anything. +/// Shows orphans that would be removed and dependencies that are kept (shared). +fn preview_cascade_plan(removing_name: &str, db: &ZlDatabase) -> ZlResult<()> { + let all_packages = db.list_packages()?; + + // Simulate: what packages become orphaned if we remove `removing_name`? + let (orphans, kept) = find_orphans_after_removal(removing_name, &all_packages, db)?; + + if !orphans.is_empty() { + println!("\n Cascade will also remove (ZL-only dependencies):"); + for orphan in &orphans { + println!(" - {}-{} [{}]", orphan.0, orphan.1, orphan.2); + } + } + + if !kept.is_empty() { + println!("\n Keeping (shared with other packages):"); + for (name, version, needed_by) in &kept { + println!(" - {}-{} (needed by {})", name, version, needed_by); + } + } + + if orphans.is_empty() { + println!("\n Cascade: no orphaned dependencies to remove."); + } + + Ok(()) +} + +/// Determine which packages would become orphaned if `removing_name` is removed. +/// Returns (orphans, kept) where: +/// - orphans: Vec<(name, version, source)> — will be removed +/// - kept: Vec<(name, version, needed_by)> — shared deps that stay +#[allow(clippy::type_complexity)] +fn find_orphans_after_removal( + removing_name: &str, + all_packages: &[crate::core::graph::model::PackageNode], + db: &ZlDatabase, +) -> ZlResult<(Vec<(String, String, String)>, Vec<(String, String, String)>)> { + let mut orphans = Vec::new(); + let mut kept = Vec::new(); + + for pkg in all_packages { + // Only consider implicit (non-explicit) packages as cascade candidates + if pkg.explicit || pkg.id.name == removing_name { + continue; + } + + let pkg_name = &pkg.id.name; + + // Check if any REMAINING package (other than the one being removed) depends on this + let has_other_dependents = all_packages.iter().any(|other| { + if other.id.name == *pkg_name || other.id.name == removing_name { + return false; + } + let other_key = format!("{}-{}", other.id.name, other.id.version); + db.get_dependencies(&other_key) + .unwrap_or_default() + .iter() + .any(|dep| { + let dep_name = dep.split(&['>', '<', '=', ':'][..]).next().unwrap_or(dep); + dep_name == pkg_name + }) + }); + + // Also check shared lib dependencies + let provides: std::collections::HashSet<&str> = + pkg.provides_libs.keys().map(|s| s.as_str()).collect(); + let has_lib_dependents = all_packages.iter().any(|other| { + other.id.name != *pkg_name + && other.id.name != removing_name + && other + .needs_libs + .iter() + .any(|lib| provides.contains(lib.as_str())) + }); + + if has_other_dependents || has_lib_dependents { + // Find who needs it (for the display message) + let needed_by = all_packages + .iter() + .filter(|other| other.id.name != *pkg_name && other.id.name != removing_name) + .find(|other| { + let other_key = format!("{}-{}", other.id.name, other.id.version); + let has_dep = db + .get_dependencies(&other_key) + .unwrap_or_default() + .iter() + .any(|dep| { + let dep_name = + dep.split(&['>', '<', '=', ':'][..]).next().unwrap_or(dep); + dep_name == pkg_name + }); + let has_lib = other + .needs_libs + .iter() + .any(|lib| provides.contains(lib.as_str())); + has_dep || has_lib + }) + .map(|p| p.id.name.clone()) + .unwrap_or_else(|| "unknown".into()); + + kept.push((pkg.id.name.clone(), pkg.id.version.clone(), needed_by)); + } else { + // Check if this package is actually a dependency of the one being removed + let is_dep_of_removing = all_packages + .iter() + .filter(|p| p.id.name == removing_name) + .any(|p| { + let key = format!("{}-{}", p.id.name, p.id.version); + db.get_dependencies(&key) + .unwrap_or_default() + .iter() + .any(|dep| { + let dep_name = + dep.split(&['>', '<', '=', ':'][..]).next().unwrap_or(dep); + dep_name == pkg_name + }) + }); + + // Also consider transitive: a dep of a dep + let is_dep_of_removing = is_dep_of_removing || { + // Check if any already-identified orphan depends on this + // (simplified: just check if this package has no other dependents at all) + !all_packages.iter().any(|other| { + other.id.name != *pkg_name + && other.id.name != removing_name + && other.explicit + && { + let other_key = format!("{}-{}", other.id.name, other.id.version); + db.get_dependencies(&other_key) + .unwrap_or_default() + .iter() + .any(|dep| { + let dep_name = + dep.split(&['>', '<', '=', ':'][..]).next().unwrap_or(dep); + dep_name == pkg_name + }) + } + }) + }; + + if is_dep_of_removing { + orphans.push(( + pkg.id.name.clone(), + pkg.id.version.clone(), + pkg.id.source.clone(), + )); + } + } + } + + Ok((orphans, kept)) +} + +/// Find and remove orphaned packages (dependencies not needed by any remaining package). +/// +/// Safety guarantees: +/// - ONLY removes packages tracked in ZL's own database (never system packages) +/// - ONLY removes packages marked as `explicit: false` (installed as dependencies) +/// - Checks BOTH declared dependencies AND shared library needs +/// - Always prints what will be removed before doing it +fn remove_orphans(paths: &ZlPaths, db: &ZlDatabase, dry_run: bool) -> ZlResult<()> { let all_packages = db.list_packages()?; - // Find packages that are not explicit and not depended on by any remaining package. - // Check both the dependency table and the shared lib needs. + // Find packages that are: + // 1. NOT explicit (were installed as dependencies by ZL) + // 2. NOT depended on by any remaining explicit or non-orphan package let orphans: Vec<_> = all_packages .iter() .filter(|pkg| !pkg.explicit) .filter(|pkg| { - // Check if any remaining package has a registered dependency on this one let pkg_name = &pkg.id.name; + + // Check if any remaining package has a registered dependency on this one let has_dependents = all_packages.iter().any(|other| { if other.id.name == *pkg_name { return false; @@ -282,7 +464,7 @@ fn remove_orphans(paths: &ZlPaths, db: &ZlDatabase) -> ZlResult<()> { return false; } - // Also check shared lib dependencies (legacy/fallback) + // Also check shared lib dependencies (fallback) let provides: std::collections::HashSet<&str> = pkg.provides_libs.keys().map(|s| s.as_str()).collect(); !all_packages.iter().any(|other| { @@ -299,10 +481,28 @@ fn remove_orphans(paths: &ZlPaths, db: &ZlDatabase) -> ZlResult<()> { return Ok(()); } - println!("\nRemoving {} orphaned dependencies:", orphans.len()); + println!("\nOrphaned dependencies to remove ({}):", orphans.len()); + for orphan in &orphans { + println!( + " - {}-{} [{}]", + orphan.id.name, orphan.id.version, orphan.id.source + ); + } + + if dry_run { + println!( + "\n[DRY-RUN] Would remove {} orphaned dependency(ies). No changes made.", + orphans.len() + ); + return Ok(()); + } + + println!(); for orphan in &orphans { let pkg_key = format!("{}-{}", orphan.id.name, orphan.id.version); - println!(" - {}-{}", orphan.id.name, orphan.id.version); + + // Remove bin symlinks + remove_bin_symlinks(&orphan.installed_files, &paths.bin)?; // Remove lib symlinks for soname in orphan.provides_libs.keys() { @@ -324,6 +524,8 @@ fn remove_orphans(paths: &ZlPaths, db: &ZlDatabase) -> ZlResult<()> { db.remove_files_for_package(&pkg_key)?; db.remove_dependencies(&pkg_key)?; db.remove_package(&orphan.id.name, &orphan.id.version)?; + + println!(" Removed orphan: {}-{}", orphan.id.name, orphan.id.version); } Ok(()) diff --git a/src/cli/search.rs b/src/cli/search.rs index f5e883e..a7fa0d7 100644 --- a/src/cli/search.rs +++ b/src/cli/search.rs @@ -1,11 +1,89 @@ +use std::sync::Mutex; + use crate::error::ZlResult; -use crate::plugin::PluginRegistry; +use crate::plugin::{PackageCandidate, PluginRegistry, SourcePlugin}; -use super::SearchArgs; +use super::{SearchArgs, SortOrder}; /// Maximum results shown per source. Use --limit to override. const DEFAULT_LIMIT: usize = 20; +/// A search result with relevance scoring +struct ScoredResult { + candidate: PackageCandidate, + score: u32, + tag: &'static str, +} + +/// Compute a relevance score for a candidate against the query. +/// +/// Higher = more relevant: +/// - 100: exact name match +/// - 80: name starts with query +/// - 60: name contains query +/// - 30: description contains query +/// - 10: fallback (matched by plugin but not by our heuristics) +fn score_candidate(candidate: &PackageCandidate, query: &str) -> (u32, &'static str) { + let name = candidate.name.to_lowercase(); + let q = query.to_lowercase(); + + if name == q { + (100, "exact") + } else if name.starts_with(&q) { + (80, "name") + } else if name.contains(&q) { + (60, "name") + } else if candidate.description.to_lowercase().contains(&q) { + (30, "desc") + } else { + (10, "") + } +} + +/// Search all plugins in parallel using thread::scope. +fn search_parallel<'a>( + plugins: &[&'a dyn SourcePlugin], + query: &str, +) -> Vec<(&'a dyn SourcePlugin, Vec)> { + let results: Mutex)>> = + Mutex::new(Vec::with_capacity(plugins.len())); + + std::thread::scope(|scope| { + let results = &results; + let mut handles = Vec::new(); + + for &plugin in plugins { + handles.push(scope.spawn(move || { + // Sync before searching + if let Err(e) = plugin.sync() { + tracing::warn!("Failed to sync {}: {}", plugin.name(), e); + return; + } + + match plugin.search(query) { + Ok(candidates) => { + if !candidates.is_empty() { + let mut r = results.lock().unwrap(); + r.push((plugin, candidates)); + } + } + Err(e) => { + tracing::warn!("Search failed for {}: {}", plugin.name(), e); + } + } + })); + } + + for handle in handles { + if handle.join().is_err() { + tracing::warn!("A search thread panicked"); + } + } + }); + + results.into_inner().unwrap_or_default() +} + pub fn handle(args: SearchArgs, registry: &PluginRegistry) -> ZlResult<()> { let plugins = match args.from.as_deref() { Some(name) => match registry.get(name) { @@ -19,50 +97,219 @@ pub fn handle(args: SearchArgs, registry: &PluginRegistry) -> ZlResult<()> { }; let limit = args.limit.unwrap_or(DEFAULT_LIMIT); - let mut total = 0; - for plugin in plugins { - // Sync before searching - if let Err(e) = plugin.sync() { - tracing::warn!("Failed to sync {}: {}", plugin.name(), e); - continue; - } + // Search all plugins in parallel + let all_results = search_parallel(&plugins, &args.query); - match plugin.search(&args.query) { - Ok(results) => { - if results.is_empty() { - continue; - } + if all_results.is_empty() { + println!("No packages found for '{}'.", args.query); + return Ok(()); + } - let shown = results.len().min(limit); - println!("── {} ──", plugin.display_name()); - for candidate in results.iter().take(limit) { - println!( - " {:<30} {:<15} {}", - candidate.name, candidate.version, candidate.description - ); - } - if results.len() > limit { - println!( - " ... and {} more (use --limit N or --from {} to narrow down)", - results.len() - limit, - plugin.name() - ); + let mut total_shown = 0; + + for (plugin, candidates) in &all_results { + // Score and filter results + let mut scored: Vec = candidates + .iter() + .map(|c| { + let (score, tag) = score_candidate(c, &args.query); + ScoredResult { + candidate: c.clone(), + score, + tag, } - println!(); - total += shown; + }) + .collect(); + + // If --exact, keep only exact name matches + if args.exact { + scored.retain(|s| s.score == 100); + } + + if scored.is_empty() { + continue; + } + + // Sort by requested order + match args.sort { + SortOrder::Relevance => scored.sort_by(|a, b| b.score.cmp(&a.score)), + SortOrder::Name => { + scored.sort_by(|a, b| a.candidate.name.cmp(&b.candidate.name)); } - Err(e) => { - tracing::warn!("Search failed for {}: {}", plugin.name(), e); + SortOrder::Version => { + scored.sort_by(|a, b| b.candidate.version.cmp(&a.candidate.version)); } } + + let total_count = scored.len(); + let shown = total_count.min(limit); + + println!( + "── {} ({} result{}) ──", + plugin.display_name(), + total_count, + if total_count == 1 { "" } else { "s" } + ); + + for entry in scored.iter().take(limit) { + let tag_str = if entry.tag.is_empty() { + String::new() + } else { + format!(" [{}]", entry.tag) + }; + + // Truncate description to 55 chars + let desc: String = if entry.candidate.description.len() > 55 { + format!("{}...", &entry.candidate.description[..52]) + } else { + entry.candidate.description.clone() + }; + + println!( + " {:<30} {:<15} {}{}", + entry.candidate.name, entry.candidate.version, desc, tag_str + ); + } + + if total_count > limit { + println!( + " ... and {} more (use --limit {} or --from {} to see all)", + total_count - limit, + total_count, + plugin.name() + ); + } + println!(); + total_shown += shown; } - if total == 0 { - println!("No packages found for '{}'.", args.query); + if total_shown == 0 { + if args.exact { + println!( + "No exact matches for '{}'. Try without --exact.", + args.query + ); + } else { + println!("No packages found for '{}'.", args.query); + } } else { - println!("{} result(s) shown.", total); + println!( + "{} result(s) shown across {} source(s).", + total_shown, + all_results.len() + ); + if !args.exact { + println!( + "Tip: use `zl search {} --exact` for exact matches only.", + args.query + ); + } } Ok(()) } + +#[cfg(test)] +mod tests { + use super::*; + + #[test] + fn test_score_exact_match() { + let candidate = PackageCandidate { + name: "firefox".into(), + version: "120.0".into(), + description: "Web browser".into(), + arch: "x86_64".into(), + source: "pacman".into(), + dependencies: vec![], + provides: vec![], + conflicts: vec![], + installed_size: 0, + download_url: String::new(), + checksum: None, + }; + let (score, tag) = score_candidate(&candidate, "firefox"); + assert_eq!(score, 100); + assert_eq!(tag, "exact"); + } + + #[test] + fn test_score_starts_with() { + let candidate = PackageCandidate { + name: "firefox-esr".into(), + version: "120.0".into(), + description: "Web browser ESR".into(), + arch: "x86_64".into(), + source: "pacman".into(), + dependencies: vec![], + provides: vec![], + conflicts: vec![], + installed_size: 0, + download_url: String::new(), + checksum: None, + }; + let (score, tag) = score_candidate(&candidate, "firefox"); + assert_eq!(score, 80); + assert_eq!(tag, "name"); + } + + #[test] + fn test_score_contains_in_name() { + let candidate = PackageCandidate { + name: "lib32-firefox".into(), + version: "1.0".into(), + description: "32-bit compat".into(), + arch: "x86_64".into(), + source: "pacman".into(), + dependencies: vec![], + provides: vec![], + conflicts: vec![], + installed_size: 0, + download_url: String::new(), + checksum: None, + }; + let (score, tag) = score_candidate(&candidate, "firefox"); + assert_eq!(score, 60); + assert_eq!(tag, "name"); + } + + #[test] + fn test_score_description_only() { + let candidate = PackageCandidate { + name: "iceweasel".into(), + version: "1.0".into(), + description: "Rebranded firefox web browser".into(), + arch: "x86_64".into(), + source: "apt".into(), + dependencies: vec![], + provides: vec![], + conflicts: vec![], + installed_size: 0, + download_url: String::new(), + checksum: None, + }; + let (score, tag) = score_candidate(&candidate, "firefox"); + assert_eq!(score, 30); + assert_eq!(tag, "desc"); + } + + #[test] + fn test_score_case_insensitive() { + let candidate = PackageCandidate { + name: "Firefox".into(), + version: "1.0".into(), + description: String::new(), + arch: "x86_64".into(), + source: "test".into(), + dependencies: vec![], + provides: vec![], + conflicts: vec![], + installed_size: 0, + download_url: String::new(), + checksum: None, + }; + let (score, _) = score_candidate(&candidate, "firefox"); + assert_eq!(score, 100); + } +} diff --git a/src/core/elf/analysis.rs b/src/core/elf/analysis.rs index 38cfeab..e7a5b44 100644 --- a/src/core/elf/analysis.rs +++ b/src/core/elf/analysis.rs @@ -121,6 +121,54 @@ pub fn scan_directory(dir: &Path) -> ZlResult> { Ok(results) } +/// Map ELF e_machine value to our Arch enum. +/// Returns None for unknown/unsupported machine types. +pub fn elf_machine_to_arch(machine: u16) -> Option { + use crate::system::arch::Arch; + use goblin::elf::header; + match machine { + header::EM_X86_64 => Some(Arch::X86_64), + header::EM_AARCH64 => Some(Arch::Aarch64), + header::EM_ARM => Some(Arch::Armv7), + header::EM_RISCV => Some(Arch::Riscv64), // could be 32-bit riscv but we assume 64 + header::EM_386 => Some(Arch::I686), + header::EM_S390 => Some(Arch::S390x), + header::EM_PPC64 => Some(Arch::Ppc64le), + header::EM_MIPS => Some(Arch::Mips64), + _ => None, + } +} + +/// Check if an ELF binary's architecture is compatible with the host system. +/// Returns Ok(()) if compatible, or a descriptive error message if not. +pub fn check_arch_compat( + info: &ElfInfo, + host_arch: &crate::system::arch::Arch, +) -> Result<(), String> { + let elf_arch = match elf_machine_to_arch(info.machine) { + Some(a) => a, + None => return Ok(()), // Unknown machine type — skip check + }; + + if elf_arch == *host_arch { + return Ok(()); + } + + // Allow i686 binaries on x86_64 (multilib compat) + if *host_arch == crate::system::arch::Arch::X86_64 + && elf_arch == crate::system::arch::Arch::I686 + { + return Ok(()); + } + + Err(format!( + "Binary {} is built for {} but your system is {}", + info.path.display(), + elf_arch, + host_arch + )) +} + #[cfg(test)] mod tests { use super::*; diff --git a/src/error.rs b/src/error.rs index c88716f..614a9b4 100644 --- a/src/error.rs +++ b/src/error.rs @@ -87,6 +87,13 @@ pub enum ZlError { #[error("Verification failed:\n{0}")] Verification(String), + // ── Architecture ── + #[allow(dead_code)] + #[error( + "Architecture mismatch: package is built for {pkg_arch} but your system is {host_arch}" + )] + ArchMismatch { pkg_arch: String, host_arch: String }, + // ── Environments ── #[error("Environment error: {0}")] Environment(String), @@ -99,6 +106,21 @@ impl ZlError { ZlError::PackageNotFound { .. } => { Some("Check the package name or try a different source with --from") } + ZlError::DownloadFailed { url, .. } if url.contains("archlinux.org") => Some( + "Mirror sync failed. Check /etc/pacman.d/mirrorlist or your internet connection", + ), + ZlError::DownloadFailed { url, .. } + if url.contains("debian.org") || url.contains("ubuntu.com") => + { + Some("Failed to fetch from APT repo. Check the repository URL in your config") + } + ZlError::DownloadFailed { url, .. } + if url.contains("github.com") || url.contains("api.github.com") => + { + Some( + "GitHub download failed. You may be rate-limited — set GITHUB_TOKEN env var or wait", + ) + } ZlError::DownloadFailed { .. } => { Some("Check your internet connection or try again later") } @@ -106,21 +128,67 @@ impl ZlError { Some("The server may be slow — try again or use a different mirror") } ZlError::ChecksumMismatch { .. } => { - Some("The downloaded file is corrupted — delete the cache and try again") + Some("The downloaded file is corrupted — run `zl cache clean` and try again") + } + ZlError::BuildToolMissing { tool } if tool == "git" => { + Some("Install git: sudo pacman -S git (Arch) or sudo apt install git (Debian)") + } + ZlError::BuildToolMissing { tool } if tool == "makepkg" => { + Some("makepkg is part of pacman. On non-Arch systems, AUR builds are not supported") } ZlError::BuildToolMissing { .. } => { Some("Install the required build tool with your system package manager") } + ZlError::BuildFailed { message, .. } + if message.contains("base-devel") || message.contains("fakeroot") => + { + Some("Install base-devel: sudo pacman -S --needed base-devel") + } + ZlError::BuildFailed { message, .. } + if message.contains("PGP") || message.contains("signature") => + { + Some("A PGP key is missing. Import it or rebuild with --skippgpcheck") + } + ZlError::BuildFailed { .. } => { + Some("Check the PKGBUILD for errors or missing build dependencies") + } ZlError::PackageConflict { .. } => { Some("Remove the conflicting package first with `zl remove`") } + ZlError::Plugin { plugin, message } if plugin == "aur" && message.contains("HTTP") => { + Some( + "AUR API returned an error. The package name may be incorrect or AUR may be down", + ) + } + ZlError::Plugin { plugin, message } + if plugin == "github" && message.contains("rate") => + { + Some( + "GitHub API rate limit reached. Set GITHUB_TOKEN env var to increase the limit", + ) + } + ZlError::Plugin { plugin, .. } if plugin == "github" => Some( + "GitHub packages require owner/repo format (e.g., `zl install BurntSushi/ripgrep --from github`)", + ), + ZlError::Plugin { .. } => None, ZlError::GpgVerification { .. } => Some( "The package signature is invalid — this may indicate tampering. Use --skip-verify to bypass (not recommended)", ), - ZlError::SelfUpdate(msg) if msg.contains("Permission denied") => { + ZlError::ArchMismatch { .. } => Some( + "This package was built for a different CPU architecture and cannot run on your system", + ), + ZlError::SelfUpdate(msg) + if msg.contains("Permission denied") || msg.contains("not writable") => + { Some("Run with elevated permissions: sudo zl self-update") } + ZlError::SelfUpdate(msg) if msg.contains("No binary found") => Some( + "No prebuilt binary for your architecture. Build from source: cargo install --git https://github.com/supercosti21/zero_layer", + ), ZlError::SelfUpdate(_) => Some("Check your internet connection and try again"), + ZlError::Archive(_) => { + Some("The archive may be corrupted. Run `zl cache clean` and try again") + } _ => None, } } diff --git a/src/plugin/aur/mod.rs b/src/plugin/aur/mod.rs index f19d6c0..6675cf8 100644 --- a/src/plugin/aur/mod.rs +++ b/src/plugin/aur/mod.rs @@ -3,6 +3,9 @@ //! Searches the AUR via its JSON RPC API v5, then builds packages locally //! using `git` + `makepkg` (requires `base-devel` group installed). //! +//! When searching, also discovers `-bin` variants (precompiled binaries) +//! so users can choose between building from source or using a binary. +//! //! Usage: zl install yay --from aur //! zl search rofi-wayland --from aur @@ -15,6 +18,9 @@ use crate::plugin::{ExtractedPackage, PackageCandidate, SourcePlugin}; const AUR_RPC: &str = "https://aur.archlinux.org/rpc/v5"; const AUR_GIT: &str = "https://aur.archlinux.org"; +/// Common suffixes for AUR binary/precompiled variants +const BIN_SUFFIXES: &[&str] = &["-bin", "-appimage", "-prebuilt"]; + // ── AUR RPC response types ──────────────────────────────────────────────────── #[derive(serde::Deserialize)] @@ -65,10 +71,18 @@ impl AurPlugin { } fn to_candidate(pkg: &AurPackage) -> PackageCandidate { + // Tag binary variants in the description so the user can tell them apart + let is_bin_variant = BIN_SUFFIXES.iter().any(|s| pkg.name.ends_with(s)); + let description = if is_bin_variant { + format!("[binary] {}", pkg.description.clone().unwrap_or_default()) + } else { + pkg.description.clone().unwrap_or_default() + }; + PackageCandidate { name: pkg.name.clone(), version: pkg.version.clone(), - description: pkg.description.clone().unwrap_or_default(), + description, arch: "any".to_string(), // set correctly after build by .PKGINFO source: "aur".to_string(), dependencies: pkg.depends.clone(), @@ -104,6 +118,25 @@ impl AurPlugin { message: format!("Failed to parse AUR response: {}", e), }) } + + /// Fetch info for specific package names via the AUR multiinfo endpoint. + /// Returns candidates for all names that exist on AUR. + fn fetch_info_multi(&self, names: &[String]) -> ZlResult> { + if names.is_empty() { + return Ok(vec![]); + } + // AUR RPC v5 info endpoint accepts multiple args: /info/{name1},{name2},... + // But the standard way is multiple &arg[]= params + let mut url = format!("{}/info?", AUR_RPC); + for (i, name) in names.iter().enumerate() { + if i > 0 { + url.push('&'); + } + url.push_str(&format!("arg[]={}", name)); + } + let resp = self.fetch_rpc(&url)?; + Ok(resp.results.iter().map(Self::to_candidate).collect()) + } } // ── SourcePlugin implementation ─────────────────────────────────────────────── @@ -127,7 +160,31 @@ impl SourcePlugin for AurPlugin { fn search(&self, query: &str) -> ZlResult> { let url = format!("{}/search/{}?by=name-desc", AUR_RPC, query); let resp = self.fetch_rpc(&url)?; - Ok(resp.results.iter().map(Self::to_candidate).collect()) + let mut results: Vec = + resp.results.iter().map(Self::to_candidate).collect(); + + // If query doesn't already end with a binary suffix, also look up -bin variants + let already_has_bin_suffix = BIN_SUFFIXES.iter().any(|s| query.ends_with(s)); + if !already_has_bin_suffix { + let bin_names: Vec = BIN_SUFFIXES + .iter() + .map(|s| format!("{}{}", query, s)) + .collect(); + + // Only fetch variants that aren't already in the results + let existing_names: std::collections::HashSet<&str> = + results.iter().map(|r| r.name.as_str()).collect(); + let missing: Vec = bin_names + .into_iter() + .filter(|n| !existing_names.contains(n.as_str())) + .collect(); + + if let Ok(bin_results) = self.fetch_info_multi(&missing) { + results.extend(bin_results); + } + } + + Ok(results) } fn resolve(&self, name: &str, version: Option<&str>) -> ZlResult> { @@ -158,23 +215,28 @@ impl SourcePlugin for AurPlugin { candidate.download_url ); - let clone_status = std::process::Command::new("git") + let clone_output = std::process::Command::new("git") .args(["clone", "--depth=1"]) .arg(&candidate.download_url) .arg(&clone_dir) .env("GIT_TERMINAL_PROMPT", "0") - .status()?; + .output()?; - if !clone_status.success() { + if !clone_output.status.success() { + let stderr = String::from_utf8_lossy(&clone_output.stderr); return Err(ZlError::Plugin { plugin: "aur".into(), - message: format!("git clone failed for {}", candidate.download_url), + message: format!( + "git clone failed for {}:\n {}", + candidate.download_url, + stderr.trim() + ), }); } - tracing::info!("Building {} with makepkg…", candidate.name); + tracing::info!("Building {} with makepkg...", candidate.name); - let build_status = std::process::Command::new("makepkg") + let build_output = std::process::Command::new("makepkg") .args([ "--syncdeps", // install build deps via pacman "--force", // overwrite existing pkg file @@ -182,12 +244,26 @@ impl SourcePlugin for AurPlugin { "--noprogressbar", ]) .current_dir(&clone_dir) - .status()?; - - if !build_status.success() { + .output()?; + + if !build_output.status.success() { + let stderr = String::from_utf8_lossy(&build_output.stderr); + let hint = if stderr.contains("base-devel") || stderr.contains("fakeroot") { + "\n hint: install base-devel: sudo pacman -S --needed base-devel" + } else if stderr.contains("PGP") || stderr.contains("signature") { + "\n hint: import the PGP key or use makepkg with --skippgpcheck" + } else if stderr.contains("dependency") { + "\n hint: a build dependency could not be installed — check the PKGBUILD" + } else { + "" + }; return Err(ZlError::BuildFailed { package: candidate.name.clone(), - message: "makepkg failed — check PKGBUILD or install base-devel".into(), + message: format!( + "makepkg failed:\n {}{}", + stderr.lines().take(5).collect::>().join("\n "), + hint + ), }); } @@ -276,4 +352,35 @@ mod tests { assert_eq!(c.source, "aur"); assert!(c.download_url.contains("aur.archlinux.org")); } + + #[test] + fn test_to_candidate_bin_tagged() { + let pkg = AurPackage { + name: "yay-bin".into(), + package_base: "yay-bin".into(), + version: "12.3.5-1".into(), + description: Some("AUR helper (prebuilt)".into()), + depends: vec![], + conflicts: vec![], + provides: vec!["yay".into()], + }; + let c = AurPlugin::to_candidate(&pkg); + assert_eq!(c.name, "yay-bin"); + assert!(c.description.starts_with("[binary]")); + } + + #[test] + fn test_to_candidate_source_not_tagged() { + let pkg = AurPackage { + name: "yay".into(), + package_base: "yay".into(), + version: "12.3.5-1".into(), + description: Some("AUR helper".into()), + depends: vec!["go".into()], + conflicts: vec![], + provides: vec![], + }; + let c = AurPlugin::to_candidate(&pkg); + assert!(!c.description.contains("[binary]")); + } } From 1b136d40fdd63b6670af5158f8964abe34292ff5 Mon Sep 17 00:00:00 2001 From: Claude Date: Mon, 23 Feb 2026 10:14:41 +0000 Subject: [PATCH 5/5] docs: add development workflow rules to CLAUDE.md and README.md - Feature branches only, never work on main directly - Branch naming: feat/, fix/, chore/, refactor/, docs/ - Merge only when CI passes, delete branch after - Atomic commits: 1 commit = 1 concept - Commit format: type: title + bullet points - Documentation: update CLAUDE.md + README.md after changes https://claude.ai/code/session_01W29EWLxxuZuRnfP9pJjxuB --- CLAUDE.md | 12 ++++++++++-- README.md | 11 ++++++++++- 2 files changed, 20 insertions(+), 3 deletions(-) diff --git a/CLAUDE.md b/CLAUDE.md index 1be8768..1eb5222 100644 --- a/CLAUDE.md +++ b/CLAUDE.md @@ -4,8 +4,16 @@ This file provides guidance to Claude Code (claude.ai/code) when working with co ## Rules -- **Every time you make changes to the codebase**, update this CLAUDE.md file to reflect the new state (implementation status, module structure, known issues, etc.). -- **If changes affect user-facing features, architecture, or usage**, also update README.md accordingly. +These rules are **mandatory** for every Claude instance working on this repo. + +### Git workflow + +1. **Feature branches** — NEVER work directly on `main`. Always create a branch first. +2. **Branch naming** — Use prefixed names: `feat/xxx`, `fix/xxx`, `chore/xxx`, `refactor/xxx`, `docs/xxx`. +3. **Merge** — Only merge to `main` when everything works (tests pass, clippy clean, fmt clean). Delete the branch after merge. +6. **Atomic commits** — 1 commit = 1 concept. Better 3 small focused commits than 1 giant commit. Each commit should be self-contained and pass CI on its own. +7. **Commit message format** — `type: clear title` where type is `feat`, `fix`, `chore`, `refactor`, or `docs`. Add bullet points in the body for details when needed. +8. **Documentation** — After every significant change, update `CLAUDE.md` first (implementation state, module structure, test count), then `README.md` if user-facing features changed. ## Project Overview diff --git a/README.md b/README.md index 100e547..59b5fcc 100644 --- a/README.md +++ b/README.md @@ -562,12 +562,21 @@ repos = ["core", "extra"] ```bash cargo build # Build -cargo test # Run all tests (79 tests) +cargo test # Run all tests (195 tests: 103 bin + 92 lib) cargo test # Run a single test cargo clippy # Lint cargo fmt # Format ``` +### Development Workflow Rules + +1. **Feature branches** — Never work directly on `main`. Always create a branch first. +2. **Branch naming** — Use prefixed names: `feat/xxx`, `fix/xxx`, `chore/xxx`, `refactor/xxx`, `docs/xxx`. +3. **Merge** — Only merge to `main` when everything works (tests pass, clippy clean, fmt clean). Delete the branch after merge. +4. **Atomic commits** — 1 commit = 1 concept. Better 3 small focused commits than 1 giant commit. +5. **Commit message format** — `type: clear title` + bullet points for details. Types: `feat`, `fix`, `chore`, `refactor`, `docs`. +6. **Documentation** — After every significant change, update `CLAUDE.md` (implementation state, module structure, test count), then `README.md` if user-facing features changed. + ## License GPL v3 — see [LICENSE](LICENSE) for details.