Form Forge

A PDF form processing application that enables non-technical D&D 5e players to add dynamic calculations to their PDF character sheets without writing JavaScript or understanding PDF AcroForm internals.

What It Does

Users upload a PDF character sheet; the application extracts its form fields and provides a visual interface for mapping calculations between them (e.g., Strength modifier = (Strength score - 10) / 2). It then generates the necessary JavaScript and embeds it into the PDF's AcroForm structure.
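As a concrete instance of the modifier formula above, here is a minimal sketch (illustrative only, not Form Forge code). Note that the D&D 5e rule rounds down, so Rust's truncating `/` is avoided in favor of `div_euclid` for scores below 10:

```rust
/// Illustrative sketch of the D&D 5e ability modifier formula,
/// floor((score - 10) / 2). `div_euclid` floors toward negative
/// infinity, matching the "round down" rule for odd scores below 10.
fn ability_modifier(score: i32) -> i32 {
    (score - 10).div_euclid(2)
}

fn main() {
    assert_eq!(ability_modifier(15), 2); // 15 -> +2
    assert_eq!(ability_modifier(10), 0); // 10 -> +0
    assert_eq!(ability_modifier(7), -2); // 7  -> -2, not -1
    println!("ok");
}
```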

Form Forge ships in two distribution modes:

  • Web: A Next.js frontend paired with a Rust API server, deployed via Docker Compose. Uses PostgreSQL for persistence and S3-compatible storage for PDF files.
  • Native: A Tauri desktop application that bundles the Rust backend locally. Uses embedded libSQL and filesystem storage — no external services required.

Tech Stack

Shared

  • Rust core crates (domain logic, PDF processing via lopdf)
  • Hexagonal architecture with vertical slicing
  • Turborepo monorepo with pnpm workspaces
  • shadcn/ui components + Tailwind CSS
  • React 19

Web

  • Actix-Web for HTTP API
  • SQLx with PostgreSQL for persistence
  • S3-compatible object storage (RustFS)
  • Next.js (App Router)

Native

  • Tauri v2 desktop shell
  • Embedded libSQL for persistence
  • Filesystem storage for PDF files

Prerequisites

Shared

  • Rust (stable toolchain)

Recommended: Install via asdf (version manager)

The following tools are managed via asdf (see .tool-versions):

  • Node.js
  • pnpm
  • just
  • Python (required for pre-commit)
  • pre-commit
# Install asdf plugins and tools
asdf plugin add nodejs
asdf plugin add pnpm
asdf plugin add just
asdf plugin add python
asdf plugin add pre-commit

# Install all versions from .tool-versions
asdf install

Web

  • Docker + Docker Compose

Native

  • Tauri v2 system dependencies for your platform (see Building from Source)

Setup

Web

pnpm install

# Configure environment
cp .env.example .env
# Edit .env if needed (defaults work for local development)

# Start PostgreSQL + RustFS + Adminer
just up

# Run backend API + Next.js frontend
just dev

Access points:

Native

pnpm install
cd apps/native
pnpm tauri dev

Building from Source

Building from source produces the Tauri desktop binary. Ensure you have the prerequisites installed, including the Tauri v2 system dependencies for your platform.

git clone https://github.com/maikbasel/form-forge.git
cd form-forge
pnpm install
cd apps/native
pnpm tauri build

The compiled binaries will be in apps/native/src-tauri/target/release/bundle/.

Development

Commands

Shared / Monorepo

pnpm build            # Build all packages
pnpm lint             # Lint all packages
pnpm check-types      # Type checking
pnpm exec ultracite fix     # Format/fix (Biome-based)
pnpm exec ultracite check   # Check for issues without fixing

Backend (used by both Web and Native)

cd apps/api && cargo test
cd apps/api && cargo fmt --all
cd apps/api && cargo clippy --workspace --all-targets --all-features -- -D warnings

Web

just dev              # Backend API + Next.js frontend
just up               # Start Docker infrastructure only
just down             # Stop Docker infrastructure
just be               # Run API server only
just web              # Next.js web app only

Native

just native                    # Run Tauri desktop app (dev mode)
cd apps/native && pnpm tauri dev    # Alternative
cd apps/native && pnpm tauri build  # Production build

Code Quality

  • Rust: Uses rustfmt and clippy (enforced via pre-commit hooks)
  • TypeScript: Uses Ultracite (Biome preset) for formatting and linting

Pre-commit Hooks Setup:

If you installed via asdf (recommended), pre-commit is already available. Just run:

pre-commit install

This installs git hooks that automatically run on every commit:

  • Backend: cargo fmt --check, cargo clippy, cargo check
  • Frontend: pnpm exec ultracite fix

Project Structure

crates/                     # Shared Rust crates (Shared)
  sheets_core/              # Sheets domain logic + ports
  sheets_pdf/               # PDF adapter for sheets
  actions_core/             # Actions domain logic + ports + JS helpers
  actions_pdf/              # PDF adapter for actions
  common_pdf/               # Shared PDF utilities
  common_telemetry/         # Shared telemetry

apps/
  api/                      # Rust backend — Actix-Web (Web)
    crates/
      common/               # API-specific utilities (DB config, errors)
      sheets/adapters/      # Sheets HTTP, S3, and DB adapters
      actions/adapters/     # Actions HTTP adapter
  web/                      # Next.js web application (Web)
  native/                   # Tauri desktop application (Native)
    src-tauri/crates/
      sheets_fs/            # Filesystem adapter for sheets (Native)
      sheets_libsql/        # libSQL adapter for sheets (Native)

packages/                   # Shared frontend packages (Shared)
  ui/                       # Shared React component library
  api-spec/                 # OpenAPI spec and generated API client
  typescript-config/        # Shared TypeScript configs

Architecture

The backend uses hexagonal architecture with two bounded contexts:

  • Sheets: Handles PDF form upload, storage, and field extraction
  • Actions: Handles JavaScript action generation and PDF modification

See Development Guide for detailed architecture documentation.
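The ports-and-adapters idea behind this layout can be sketched as follows. All names here (`SheetRepository`, `InMemorySheets`, `rename_sheet`) are hypothetical, not Form Forge's actual API; the real ports live in the `*_core` crates and the adapters in `apps/api` and `apps/native`:

```rust
use std::collections::HashMap;

// The domain core defines a port (trait). Each distribution mode
// plugs in its own adapter: Postgres/S3 for Web, libSQL/filesystem
// for Native. Names here are illustrative only.
trait SheetRepository {
    fn save(&mut self, id: u64, name: String);
    fn find(&self, id: u64) -> Option<&String>;
}

// A trivial in-memory adapter, the kind used in unit tests.
#[derive(Default)]
struct InMemorySheets {
    sheets: HashMap<u64, String>,
}

impl SheetRepository for InMemorySheets {
    fn save(&mut self, id: u64, name: String) {
        self.sheets.insert(id, name);
    }
    fn find(&self, id: u64) -> Option<&String> {
        self.sheets.get(&id)
    }
}

// Domain logic depends only on the port, never on a concrete adapter.
fn rename_sheet(repo: &mut dyn SheetRepository, id: u64, name: &str) {
    repo.save(id, name.to_string());
}

fn main() {
    let mut repo = InMemorySheets::default();
    rename_sheet(&mut repo, 1, "character-sheet.pdf");
    assert_eq!(repo.find(1), Some(&"character-sheet.pdf".to_string()));
}
```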

Testing

Backend (Web and Native)

# Run all tests (includes testcontainers for integration tests)
cd apps/api && cargo test

# Run specific test
cd apps/api && cargo test <test_name>

# Run tests for specific crate
cd apps/api && cargo test -p <crate_name>
# Example: cd apps/api && cargo test -p sheets_core

Testing stack:

  • rstest for parameterized tests
  • testcontainers for PostgreSQL integration tests
  • mockall for mocking port traits (via #[cfg_attr(test, automock)])
  • pretty_assertions for test assertions

Frontend (Web and Native)

The monorepo uses Turborepo for orchestrating tests across packages.

# Run all tests across the monorepo (using Turborepo)
pnpm test

# Run tests for specific package
pnpm --filter <package-name> test

# Examples:
# pnpm --filter web test
# pnpm --filter ui test

E2E Tests (Web only)

E2E tests use Playwright across Chromium, Firefox, and WebKit.

Local (headed browsers, requires Rust toolchain + Playwright browsers installed):

pnpm test:e2e

Docker (headless, requires only Docker):

pnpm test:e2e:docker    # Run tests
pnpm test:e2e:docker:down  # Clean up containers/volumes

# Or using just (runs + cleans up automatically):
just test-e2e-docker

Deployment

Web (Docker Compose)

The repository includes three Docker Compose configurations:

  • compose.dev.yml - Development (used by just up)
  • compose.prod.yml - Production (full stack with nginx reverse proxy)
  • compose.e2e.yml - E2E testing (full stack with Playwright runner)

Development (infrastructure only):

just up    # Start PostgreSQL + RustFS + Adminer
just down  # Stop infrastructure

Production (full stack):

# Configure environment (see .env.example)
cp .env.example .env
# Edit .env with production values

# Build and start all services
docker compose -f compose.prod.yml up --build -d

# View logs
docker compose -f compose.prod.yml logs -f

# Stop services
docker compose -f compose.prod.yml down

Production access points (behind nginx reverse proxy):

  • Web UI: https://<domain>/
  • API: https://<domain>/sheets, https://<domain>/dnd5e
  • Swagger UI: https://<domain>/swagger-ui/
  • S3 Console: https://<domain>/s3-console (redirects to RustFS admin UI, login with S3_ACCESS_KEY / S3_SECRET_KEY)

Required environment variables for production:

# Database (required)
POSTGRES_PASSWORD=your-secure-password

# S3 Storage (required)
S3_ACCESS_KEY=your-access-key
S3_SECRET_KEY=your-secret-key
S3_PUBLIC_ENDPOINT=https://yourdomain.com/s3  # Public URL for downloads

# Optional
POSTGRES_USER=postgres        # default: postgres
POSTGRES_DB=form-forge        # default: form-forge
S3_BUCKET=form-forge          # default: form-forge
HTTP_PORT=80                  # default: 80
RUST_LOG=info                 # default: info
S3_LIFECYCLE_EXPIRATION_DAYS=7  # default: 7

# OpenTelemetry (optional)
OTEL_EXPORTER_OTLP_ENDPOINT=http://your-signoz:4318
OTEL_SERVICE_NAME=form-forge-api  # default: form-forge-api

Native (Pre-built Binaries)

Pre-built binaries are available on the Releases page.

Note

The pre-built binaries are not code-signed. This means:

  • Windows: Microsoft SmartScreen may show a "Windows protected your PC" warning. Click "More info" and then "Run anyway" to proceed.
  • macOS: Gatekeeper will block the app with a message that it "can't be opened because it is from an unidentified developer." To bypass this, go to System Settings > Privacy & Security, find the blocked app, and click "Open Anyway". Alternatively, right-click the app and select "Open".
  • Linux: No additional steps required.

The application is safe to use — code signing certificates are costly for an open-source project. You can verify the integrity of the binaries by building from source.

OpenTelemetry / Observability

The backend supports exporting traces and metrics via OpenTelemetry. This is disabled by default and can be enabled at runtime.

To enable: Set OTEL_EXPORTER_OTLP_ENDPOINT in your .env file:

# For SigNoz
OTEL_EXPORTER_OTLP_ENDPOINT=http://signoz-otel-collector:4318

# For Jaeger
OTEL_EXPORTER_OTLP_ENDPOINT=http://jaeger:4318

# Optionally customize service name (default: form-forge-api)
OTEL_SERVICE_NAME=my-form-forge-instance

To disable: Leave OTEL_EXPORTER_OTLP_ENDPOINT unset or empty. No traces or metrics will be exported.

Document TTL: Uploaded PDFs are automatically deleted after 1 day (configurable via config/lifecycle.json). Database records are cleaned up via S3 webhook notifications and hourly reconciliation.

Database

Web

PostgreSQL runs on port 5434 (local) or 5432 (Docker). Migrations auto-run on API startup via SQLx.

Manual migration commands:

cd apps/api
sqlx migrate run
sqlx migrate revert

Native

Uses embedded libSQL — no database setup required.

API Documentation

Interactive OpenAPI/Swagger UI available at /swagger-ui/ when the backend is running.

FAQ

The native app crashes on startup with "Protocol error" or error 71 (Native)

This affects NVIDIA GPUs on Wayland. WebKitGTK fails to negotiate the explicit sync protocol with the NVIDIA driver. Fix: set __NV_DISABLE_EXPLICIT_SYNC=1 in your environment.

Permanent fix (recommended) — create a systemd environment drop-in, picked up on every login and visible to all apps including GUI launchers and IDEs:

mkdir -p ~/.config/environment.d
echo '__NV_DISABLE_EXPLICIT_SYNC=1' >> ~/.config/environment.d/nvidia-wayland.conf

Log out and back in for it to take effect.

Terminal only — if you only run the app from a terminal, adding it to your shell profile is sufficient:

# ~/.bashrc / ~/.zshrc
export __NV_DISABLE_EXPLICIT_SYNC=1

# ~/.config/fish/config.fish
set -gx __NV_DISABLE_EXPLICIT_SYNC 1

Note: shell profile variables are not visible to GUI-launched apps (IDEs, app menu shortcuts).

This flag disables explicit sync negotiation at the driver level without disabling DMA-BUF hardware acceleration. Non-NVIDIA systems do not need this.

Contributing

  1. Fork the repository
  2. Create a feature branch
  3. Make changes following code quality standards (Ultracite/Clippy)
  4. Run tests and linting
  5. Submit a pull request

Adding a New Calculation Action

The action system uses a registry pattern — all action metadata is centralized on the CalculationAction enum in Rust. The frontend fetches this metadata at runtime via GET /dnd5e/action-types, so no frontend code changes are needed when adding a new action.

You only need to touch 4 files:

1. Add the enum variant and implement methods

File: crates/actions_core/src/action.rs

Add a new variant to the CalculationAction enum with its field mappings:

#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all_fields = "camelCase")]
pub enum CalculationAction {
    // ... existing variants ...
    InitiativeModifier {
        dexterity_modifier_field_name: String,
        initiative_field_name: String,
    },
}

Then implement the four required methods on the enum. Follow the pattern of existing variants:

  • action_label() — return the PascalCase variant name (e.g., "InitiativeModifier"). Used for persistence.
  • target_field() — return a reference to the field that receives the calculated value (e.g., &self.initiative_field_name).
  • generate_js() — return the JavaScript expression that performs the calculation. Use serialize_field_name() to safely quote field names for JS.
  • action_type_catalog() — add a new ActionTypeMetadata entry describing the action's id (kebab-case), label (PascalCase), and field roles. Each role specifies its key (matching the serde field name), whether it's required, and whether it's the is_target output field.

Optionally add a convenience constructor (e.g., CalculationAction::initiative_modifier(...)).
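The method pattern described above can be sketched like this, reduced to a single variant. This is a simplified stand-in, not the crate's real implementation: `serialize_field_name` is shown with a naive body, and `action_type_catalog()` is omitted for brevity:

```rust
// Hypothetical sketch of the variant-method pattern from step 1 above.
#[derive(Debug)]
enum CalculationAction {
    InitiativeModifier {
        dexterity_modifier_field_name: String,
        initiative_field_name: String,
    },
}

// Simplified stand-in for the real helper: quote a field name so it
// can be embedded safely in generated JavaScript.
fn serialize_field_name(name: &str) -> String {
    format!("\"{}\"", name.replace('\\', "\\\\").replace('"', "\\\""))
}

impl CalculationAction {
    // PascalCase variant name, used for persistence.
    fn action_label(&self) -> &'static str {
        match self {
            Self::InitiativeModifier { .. } => "InitiativeModifier",
        }
    }

    // The field that receives the calculated value.
    fn target_field(&self) -> &str {
        match self {
            Self::InitiativeModifier { initiative_field_name, .. } => initiative_field_name,
        }
    }

    // The JS expression performing the calculation (illustrative body).
    fn generate_js(&self) -> String {
        match self {
            Self::InitiativeModifier { dexterity_modifier_field_name, .. } => format!(
                "event.value = this.getField({}).value;",
                serialize_field_name(dexterity_modifier_field_name)
            ),
        }
    }
}

fn main() {
    let action = CalculationAction::InitiativeModifier {
        dexterity_modifier_field_name: "DEX_MOD".to_string(),
        initiative_field_name: "INITIATIVE".to_string(),
    };
    assert_eq!(action.action_label(), "InitiativeModifier");
    assert_eq!(action.target_field(), "INITIATIVE");
    assert_eq!(
        action.generate_js(),
        "event.value = this.getField(\"DEX_MOD\").value;"
    );
}
```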

2. Add the JS helper function (if needed)

File: crates/actions_core/js/dnd-helpers.js

If your action's generate_js() calls a helper function that doesn't exist yet, add it here. This file is embedded into every PDF as document-level JavaScript. Keep functions pure and self-contained — they run inside a PDF viewer's JS engine.

3. Add i18n translations

Files: packages/i18n/locales/en/actions.json and packages/i18n/locales/de/actions.json

Add translations keyed by the kebab-case action id from action_type_catalog(). The frontend looks up translations by convention:

{
  "initiative-modifier": {
    "name": "Initiative Modifier",
    "description": "Calculate initiative from dexterity modifier",
    "roles": {
      "dexterityModifierFieldName": {
        "label": "Dexterity Modifier",
        "hint": "The DEX modifier field"
      },
      "initiativeFieldName": {
        "label": "Initiative",
        "hint": "Where the initiative value will appear"
      }
    }
  }
}

The role keys must match the key values in your FieldRoleMetadata entries from the catalog.

What you do NOT need to change

The following are all handled automatically by the registry pattern:

  • Web handler / API endpoint — the single PUT /dnd5e/{sheet_id}/actions accepts all action types
  • main.rs / openapi.rs — no new service registrations needed
  • Frontend types (AttachActionRequest, etc.) — generic, not per-action
  • API client code (web or Tauri) — uses the action metadata dynamically
  • UI components (useActions, ActionConfigModal, etc.) — data-driven from backend metadata
  • OpenAPI spec / generated types — the CalculationAction schema covers all variants via oneOf
  • Tauri commands — the list_action_types command returns the catalog automatically
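The reason the OpenAPI spec needs no manual edits is that the `CalculationAction` schema is a discriminated union; the generated spec has roughly this shape (illustrative fragment only, regenerated automatically from the Rust types):

```yaml
# Illustrative shape only -- not the actual generated spec.
components:
  schemas:
    CalculationAction:
      oneOf:
        - $ref: '#/components/schemas/InitiativeModifier'
        # ...one entry per enum variant, added automatically
```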

Verification checklist

# Backend compiles and tests pass
cd apps/api && cargo test
cd apps/api && cargo clippy --workspace --all-targets --all-features -- -D warnings

# Frontend type-checks and tests pass
pnpm check-types
pnpm test

# Lint passes
pnpm exec ultracite check
