A PDF form processing application that enables non-technical D&D 5e players to add dynamic calculations to their PDF character sheets without writing JavaScript or understanding PDF AcroForm internals.
- What It Does
- Tech Stack
- Prerequisites
- Setup
- Development
- Project Structure
- Testing
- Deployment
- Database
- API Documentation
- FAQ
- Contributing
## What It Does
Users upload a PDF character sheet; the application extracts its form fields and provides a visual interface to map calculations between fields (e.g., Strength modifier = (Strength score - 10) / 2). The application then generates the necessary JavaScript and embeds it into the PDF's AcroForm structure.
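For illustration, a generated calculation could look roughly like the following sketch (the field names and helper are hypothetical examples, not the app's actual output):

```javascript
// Hypothetical sketch of a generated AcroForm calculation.
// Field names ("Strength") are placeholders, not real defaults.
function abilityModifier(score) {
  // D&D 5e: modifier = floor((score - 10) / 2)
  return Math.floor((score - 10) / 2);
}

// Inside a PDF viewer, this would run as the target field's
// calculation action, e.g.:
//   event.value = abilityModifier(Number(this.getField("Strength").value));
console.log(abilityModifier(14)); // 2
console.log(abilityModifier(9));  // -1
```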
Form Forge ships in two distribution modes:
- Web: A Next.js frontend paired with a Rust API server, deployed via Docker Compose. Uses PostgreSQL for persistence and S3-compatible storage for PDF files.
- Native: A Tauri desktop application that bundles the Rust backend locally. Uses embedded libSQL and filesystem storage — no external services required.
## Tech Stack
- Rust core crates (domain logic, PDF processing via lopdf)
- Hexagonal architecture with vertical slicing
- Turborepo monorepo with pnpm workspaces
- shadcn/ui components + Tailwind CSS
- React 19
- Actix-Web for HTTP API
- SQLx with PostgreSQL for persistence
- S3-compatible object storage (RustFS)
- Next.js (App Router)
- Tauri v2 desktop shell
- Embedded libSQL for persistence
- Filesystem storage for PDF files
## Prerequisites
- Rust (stable toolchain)
Recommended: install via asdf (version manager)

The following tools are managed via asdf (see `.tool-versions`):
- Node.js
- pnpm
- just
- Python (required for pre-commit)
- pre-commit
```bash
# Install asdf plugins and tools
asdf plugin add nodejs
asdf plugin add pnpm
asdf plugin add just
asdf plugin add python
asdf plugin add pre-commit

# Install all versions from .tool-versions
asdf install
```

- Docker + Docker Compose
- Tauri v2 system dependencies for your platform (system libraries, webview, etc.)
## Setup

Web (local development):

```bash
pnpm install

# Configure environment
cp .env.example .env
# Edit .env if needed (defaults work for local development)

# Start PostgreSQL + RustFS + Adminer
just up

# Run backend API + Next.js frontend
just dev
```

Access points:

- Web: http://localhost:3000
- API: http://localhost:8081
- Swagger UI: http://localhost:8081/swagger-ui/
- Adminer: http://localhost:8082
Native (desktop app):

```bash
pnpm install
cd apps/native
pnpm tauri dev
```

Building from source produces the Tauri desktop binary. Ensure you have the prerequisites installed, including the Tauri v2 system dependencies for your platform.
```bash
git clone https://github.com/maikbasel/form-forge.git
cd form-forge
pnpm install
cd apps/native
pnpm tauri build
```

The compiled binaries will be in `apps/native/src-tauri/target/release/bundle/`.
## Development

Frontend (Turborepo):

```bash
pnpm build                  # Build all packages
pnpm lint                   # Lint all packages
pnpm check-types            # Type checking
pnpm exec ultracite fix     # Format/fix (Biome-based)
pnpm exec ultracite check   # Check for issues without fixing
```

Backend (Rust):

```bash
cd apps/api && cargo test
cd apps/api && cargo fmt --all
cd apps/api && cargo clippy --workspace --all-targets --all-features -- -D warnings
```

Just recipes:

```bash
just dev    # Backend API + Next.js frontend
just up     # Start Docker infrastructure only
just down   # Stop Docker infrastructure
just be     # Run API server only
just web    # Next.js web app only
```

Native:

```bash
just native                          # Run Tauri desktop app (dev mode)
cd apps/native && pnpm tauri dev     # Alternative
cd apps/native && pnpm tauri build   # Production build
```

Code quality:

- Rust: Uses rustfmt and clippy (enforced via pre-commit hooks)
- TypeScript: Uses Ultracite (Biome preset) for formatting and linting
Pre-commit Hooks Setup:

If you installed via asdf (recommended), pre-commit is already available. Just run:

```bash
pre-commit install
```

This installs git hooks that automatically run on every commit:

- Backend: `cargo fmt --check`, `cargo clippy`, `cargo check`
- Frontend: `pnpm exec ultracite fix`
## Project Structure

```
crates/                      # Shared Rust crates (Shared)
  sheets_core/               # Sheets domain logic + ports
  sheets_pdf/                # PDF adapter for sheets
  actions_core/              # Actions domain logic + ports + JS helpers
  actions_pdf/               # PDF adapter for actions
  common_pdf/                # Shared PDF utilities
  common_telemetry/          # Shared telemetry
apps/
  api/                       # Rust backend — Actix-Web (Web)
    crates/
      common/                # API-specific utilities (DB config, errors)
      sheets/adapters/       # Sheets HTTP, S3, and DB adapters
      actions/adapters/      # Actions HTTP adapter
  web/                       # Next.js web application (Web)
  native/                    # Tauri desktop application (Native)
    src-tauri/crates/
      sheets_fs/             # Filesystem adapter for sheets (Native)
      sheets_libsql/         # libSQL adapter for sheets (Native)
packages/                    # Shared frontend packages (Shared)
  ui/                        # Shared React component library
  api-spec/                  # OpenAPI spec and generated API client
  typescript-config/         # Shared TypeScript configs
```
The backend uses hexagonal architecture with two bounded contexts:
- Sheets: Handles PDF form upload, storage, and field extraction
- Actions: Handles JavaScript action generation and PDF modification
See Development Guide for detailed architecture documentation.
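The ports-and-adapters split can be pictured with a minimal sketch (the trait and type names here are hypothetical, not the actual ones in `sheets_core`): a core crate defines a port trait, and each distribution mode supplies its own adapter, such as S3 for Web or the filesystem for Native.

```rust
// Hypothetical port trait; the real traits in sheets_core differ.
trait SheetStorage {
    fn store(&mut self, sheet_id: &str, pdf_bytes: &[u8]) -> Result<(), String>;
    fn load(&self, sheet_id: &str) -> Result<Vec<u8>, String>;
}

// A trivial in-memory adapter, standing in for the S3 (Web) or
// filesystem (Native) adapters.
#[derive(Default)]
struct InMemoryStorage {
    sheets: std::collections::HashMap<String, Vec<u8>>,
}

impl SheetStorage for InMemoryStorage {
    fn store(&mut self, sheet_id: &str, pdf_bytes: &[u8]) -> Result<(), String> {
        self.sheets.insert(sheet_id.to_string(), pdf_bytes.to_vec());
        Ok(())
    }

    fn load(&self, sheet_id: &str) -> Result<Vec<u8>, String> {
        self.sheets
            .get(sheet_id)
            .cloned()
            .ok_or_else(|| format!("sheet {sheet_id} not found"))
    }
}

fn main() {
    // Domain code only ever sees the trait, so adapters are swappable.
    let mut storage = InMemoryStorage::default();
    storage.store("sheet-1", b"%PDF-1.7").unwrap();
    assert_eq!(storage.load("sheet-1").unwrap(), b"%PDF-1.7".to_vec());
}
```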
## Testing

```bash
# Run all tests (includes testcontainers for integration tests)
cd apps/api && cargo test

# Run specific test
cd apps/api && cargo test <test_name>

# Run tests for specific crate
cd apps/api && cargo test -p <crate_name>
# Example: cd apps/api && cargo test -p sheets-core
```

Testing stack:

- `rstest` for parameterized tests
- `testcontainers` for PostgreSQL integration tests
- `mockall` for mocking port traits (via `#[cfg_attr(test, automock)]`)
- `pretty_assertions` for test assertions
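The parameterized style that rstest enables can be pictured with a dependency-free, table-driven equivalent (rstest expresses the same cases declaratively with attribute macros; the function under test here is just an illustrative example, not repository code):

```rust
// Dependency-free sketch of a parameterized test, mirroring what
// rstest expresses with per-case attributes.
fn ability_modifier(score: i32) -> i32 {
    // floor((score - 10) / 2); div_euclid gives floor semantics
    // for negative intermediate values.
    (score - 10).div_euclid(2)
}

fn main() {
    let cases = [(1, -5), (8, -1), (10, 0), (15, 2), (20, 5)];
    for (score, expected) in cases {
        assert_eq!(ability_modifier(score), expected, "score {score}");
    }
}
```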
The monorepo uses Turborepo for orchestrating tests across packages.
```bash
# Run all tests across the monorepo (using Turborepo)
pnpm test

# Run tests for specific package
pnpm --filter <package-name> test
# Examples:
# pnpm --filter web test
# pnpm --filter ui test
```

E2E tests use Playwright across Chromium, Firefox, and WebKit.

Local (headed browsers, requires Rust toolchain + Playwright browsers installed):
```bash
pnpm test:e2e
```

Docker (headless, requires only Docker):

```bash
pnpm test:e2e:docker        # Run tests
pnpm test:e2e:docker:down   # Clean up containers/volumes

# Or using just (runs + cleans up automatically):
just test-e2e-docker
```

## Deployment

The repository includes three Docker Compose configurations:

- `compose.dev.yml` - Development (used by `just up`)
- `compose.prod.yml` - Production (full stack with nginx reverse proxy)
- `compose.e2e.yml` - E2E testing (full stack with Playwright runner)
Development (infrastructure only):
```bash
just up     # Start PostgreSQL + RustFS + Adminer
just down   # Stop infrastructure
```

Production (full stack):

```bash
# Configure environment (see .env.example)
cp .env.example .env
# Edit .env with production values

# Build and start all services
docker compose -f compose.prod.yml up --build -d

# View logs
docker compose -f compose.prod.yml logs -f

# Stop services
docker compose -f compose.prod.yml down
```

Production access points (behind nginx reverse proxy):

- Web UI: `https://<domain>/`
- API: `https://<domain>/sheets`, `https://<domain>/dnd5e`
- Swagger UI: `https://<domain>/swagger-ui/`
- S3 Console: `https://<domain>/s3-console` (redirects to RustFS admin UI, login with `S3_ACCESS_KEY`/`S3_SECRET_KEY`)
Required environment variables for production:
```bash
# Database (required)
POSTGRES_PASSWORD=your-secure-password

# S3 Storage (required)
S3_ACCESS_KEY=your-access-key
S3_SECRET_KEY=your-secret-key
S3_PUBLIC_ENDPOINT=https://yourdomain.com/s3   # Public URL for downloads

# Optional
POSTGRES_USER=postgres           # default: postgres
POSTGRES_DB=form-forge           # default: form-forge
S3_BUCKET=form-forge             # default: form-forge
HTTP_PORT=80                     # default: 80
RUST_LOG=info                    # default: info
S3_LIFECYCLE_EXPIRATION_DAYS=7   # default: 7

# OpenTelemetry (optional)
OTEL_EXPORTER_OTLP_ENDPOINT=http://your-signoz:4318
OTEL_SERVICE_NAME=form-forge-api # default: form-forge-api
```

Pre-built binaries are available on the Releases page.
> [!NOTE]
> The pre-built binaries are not code-signed. This means:
>
> - Windows: Microsoft SmartScreen may show a "Windows protected your PC" warning. Click "More info" and then "Run anyway" to proceed.
> - macOS: Gatekeeper will block the app with a message that it "can't be opened because it is from an unidentified developer." To bypass this, go to System Settings > Privacy & Security, find the blocked app, and click "Open Anyway". Alternatively, right-click the app and select "Open".
> - Linux: No additional steps required.
>
> The application is safe to use — code signing certificates are costly for an open-source project. You can verify the integrity of the binaries by building from source.
The backend supports exporting traces and metrics via OpenTelemetry. This is disabled by default and can be enabled at runtime.
To enable: set `OTEL_EXPORTER_OTLP_ENDPOINT` in your `.env` file:

```bash
# For SigNoz
OTEL_EXPORTER_OTLP_ENDPOINT=http://signoz-otel-collector:4318

# For Jaeger
OTEL_EXPORTER_OTLP_ENDPOINT=http://jaeger:4318

# Optionally customize service name (default: form-forge-api)
OTEL_SERVICE_NAME=my-form-forge-instance
```

To disable: leave `OTEL_EXPORTER_OTLP_ENDPOINT` unset or empty. No traces or metrics will be exported.

Document TTL: Uploaded PDFs are automatically deleted after 1 day (configurable via `config/lifecycle.json`). Database records are cleaned up via S3 webhook notifications and hourly reconciliation.
## Database

PostgreSQL runs on port 5434 (local) or 5432 (Docker). Migrations auto-run on API startup via SQLx.

Manual migration commands:

```bash
cd apps/api
sqlx migrate run
sqlx migrate revert
```

The native build uses embedded libSQL — no database setup required.
## API Documentation

Interactive OpenAPI/Swagger UI is available at `/swagger-ui/` when the backend is running.
## FAQ

A known issue affects NVIDIA GPUs on Wayland: WebKitGTK fails to negotiate the explicit sync protocol with the NVIDIA driver. Fix: set `__NV_DISABLE_EXPLICIT_SYNC=1` in your environment.

Permanent fix (recommended) — create a systemd environment drop-in, picked up on every login and visible to all apps including GUI launchers and IDEs:

```bash
mkdir -p ~/.config/environment.d
echo '__NV_DISABLE_EXPLICIT_SYNC=1' >> ~/.config/environment.d/nvidia-wayland.conf
```

Log out and back in for it to take effect.
Terminal only — if you only run the app from a terminal, adding it to your shell profile is sufficient:

```bash
# ~/.bashrc / ~/.zshrc
export __NV_DISABLE_EXPLICIT_SYNC=1
```

```fish
# ~/.config/fish/config.fish
set -gx __NV_DISABLE_EXPLICIT_SYNC 1
```

Note: shell profile variables are not visible to GUI-launched apps (IDEs, app menu shortcuts).
This flag disables explicit sync negotiation at the driver level without disabling DMA-BUF hardware acceleration. Non-NVIDIA systems do not need this.
## Contributing
- Fork the repository
- Create a feature branch
- Make changes following code quality standards (Ultracite/Clippy)
- Run tests and linting
- Submit a pull request
### Adding a New Action Type
The action system uses a registry pattern — all action metadata is centralized on the `CalculationAction` enum in Rust. The frontend fetches this metadata at runtime via `GET /dnd5e/action-types`, so no frontend code changes are needed when adding a new action.
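To picture what the frontend consumes, here is a hypothetical sketch of that metadata and a raw `fetch` call (the real app uses the generated client from `packages/api-spec`, and the actual type names may differ):

```typescript
// Hypothetical shape of the runtime action-type metadata; the real
// types are generated from the OpenAPI spec in packages/api-spec.
interface FieldRoleMetadata {
  key: string;       // matches the serde field name, e.g. "initiativeFieldName"
  required: boolean;
  isTarget: boolean; // true for the output field
}

interface ActionTypeMetadata {
  id: string;    // kebab-case, e.g. "initiative-modifier"
  label: string; // PascalCase, e.g. "InitiativeModifier"
  roles: FieldRoleMetadata[];
}

// Fetch the catalog once; the UI renders configuration forms from it.
async function fetchActionTypes(baseUrl: string): Promise<ActionTypeMetadata[]> {
  const res = await fetch(`${baseUrl}/dnd5e/action-types`);
  if (!res.ok) throw new Error(`failed to load action types: ${res.status}`);
  return (await res.json()) as ActionTypeMetadata[];
}

// Pick the role that receives the calculated value.
function targetRole(meta: ActionTypeMetadata): FieldRoleMetadata | undefined {
  return meta.roles.find((role) => role.isTarget);
}
```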
You only need to touch 4 files:
File: `crates/actions_core/src/action.rs`

Add a new variant to the `CalculationAction` enum with its field mappings:
```rust
#[derive(Debug, Serialize, Deserialize)]
#[serde(rename_all_fields = "camelCase")]
pub enum CalculationAction {
    // ... existing variants ...
    InitiativeModifier {
        dexterity_modifier_field_name: String,
        initiative_field_name: String,
    },
}
```

Then implement the four required methods on the enum. Follow the pattern of existing variants:
- `action_label()` — return the PascalCase variant name (e.g., `"InitiativeModifier"`). Used for persistence.
- `target_field()` — return a reference to the field that receives the calculated value (e.g., `&self.initiative_field_name`).
- `generate_js()` — return the JavaScript expression that performs the calculation. Use `serialize_field_name()` to safely quote field names for JS.
- `action_type_catalog()` — add a new `ActionTypeMetadata` entry describing the action's id (kebab-case), label (PascalCase), and field roles. Each role specifies its `key` (matching the serde field name), whether it's `required`, and whether it's the `is_target` output field.
Optionally add a convenience constructor (e.g., `CalculationAction::initiative_modifier(...)`).
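For orientation, a simplified, self-contained sketch of what those methods could look like for the new variant (the real enum has many variants and richer return types, and `serialize_field_name()` is re-sketched here as a plain JSON-style escape):

```rust
// Simplified single-variant sketch; not the actual implementation.
enum CalculationAction {
    InitiativeModifier {
        dexterity_modifier_field_name: String,
        initiative_field_name: String,
    },
}

// Stand-in for the crate's serialize_field_name() helper: quote a
// field name for safe embedding in generated JavaScript.
fn serialize_field_name(name: &str) -> String {
    format!("\"{}\"", name.replace('\\', "\\\\").replace('"', "\\\""))
}

impl CalculationAction {
    fn action_label(&self) -> &'static str {
        match self {
            Self::InitiativeModifier { .. } => "InitiativeModifier",
        }
    }

    fn target_field(&self) -> &str {
        match self {
            Self::InitiativeModifier { initiative_field_name, .. } => initiative_field_name,
        }
    }

    fn generate_js(&self) -> String {
        match self {
            Self::InitiativeModifier { dexterity_modifier_field_name, .. } => format!(
                // Acrobat JS: read the source field, write the target.
                "event.value = Number(this.getField({}).value);",
                serialize_field_name(dexterity_modifier_field_name)
            ),
        }
    }
}

fn main() {
    let action = CalculationAction::InitiativeModifier {
        dexterity_modifier_field_name: "DexMod".into(),
        initiative_field_name: "Initiative".into(),
    };
    assert_eq!(action.action_label(), "InitiativeModifier");
    assert_eq!(action.target_field(), "Initiative");
    println!("{}", action.generate_js());
}
```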
File: `crates/actions_core/js/dnd-helpers.js`

If your action's `generate_js()` calls a helper function that doesn't exist yet, add it here. This file is embedded into every PDF as document-level JavaScript. Keep functions pure and self-contained — they run inside a PDF viewer's JS engine.
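A helper in the spirit of this file might look like the following (a hypothetical function for illustration; check `dnd-helpers.js` for what already exists before adding duplicates):

```javascript
// Hypothetical helper: pure and self-contained so it can run inside
// a PDF viewer's JavaScript engine with no other dependencies.
function ffProficiencyBonus(level) {
  // D&D 5e: +2 at level 1, increasing by 1 every 4 levels.
  return 2 + Math.floor((level - 1) / 4);
}

console.log(ffProficiencyBonus(1));  // 2
console.log(ffProficiencyBonus(5));  // 3
console.log(ffProficiencyBonus(17)); // 6
```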
Files: `packages/i18n/locales/en/actions.json` and `packages/i18n/locales/de/actions.json`

Add translations keyed by the kebab-case action id from `action_type_catalog()`. The frontend looks up translations by convention:
```json
{
  "initiative-modifier": {
    "name": "Initiative Modifier",
    "description": "Calculate initiative from dexterity modifier",
    "roles": {
      "dexterityModifierFieldName": {
        "label": "Dexterity Modifier",
        "hint": "The DEX modifier field"
      },
      "initiativeFieldName": {
        "label": "Initiative",
        "hint": "Where the initiative value will appear"
      }
    }
  }
}
```

The role keys must match the `key` values in your `FieldRoleMetadata` entries from the catalog.
The following are all handled automatically by the registry pattern:
- Web handler / API endpoint — the single `PUT /dnd5e/{sheet_id}/actions` accepts all action types
- `main.rs`/`openapi.rs` — no new service registrations needed
- Frontend types (`AttachActionRequest`, etc.) — generic, not per-action
- API client code (web or Tauri) — uses the action metadata dynamically
- UI components (`useActions`, `ActionConfigModal`, etc.) — data-driven from backend metadata
- OpenAPI spec / generated types — the `CalculationAction` schema covers all variants via `oneOf`
- Tauri commands — the `list_action_types` command returns the catalog automatically
```bash
# Backend compiles and tests pass
cd apps/api && cargo test
cd apps/api && cargo clippy --workspace --all-targets --all-features -- -D warnings

# Frontend type-checks and tests pass
pnpm check-types
pnpm test

# Lint passes
pnpm exec ultracite check
```