Golden test vectors for validating QP Conduit's structured JSONL audit log format.
QP Conduit writes one JSON object per line to `~/.config/<app-name>/audit.log`. Each line is a self-contained JSON object (JSONL format, not a JSON array).
Every audit entry must contain these six fields:
| Field | Type | Description |
|---|---|---|
| `timestamp` | string | ISO 8601 UTC timestamp (`YYYY-MM-DDTHH:MM:SSZ`) |
| `action` | string | Operation identifier (see valid actions below) |
| `status` | string | `"success"` or `"failure"` |
| `message` | string | Human-readable description of the operation |
| `user` | string | System username that executed the command |
| `details` | object | Structured metadata (action-specific, may be empty `{}`) |
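Putting the six fields together, a complete entry serialized as one JSONL line might look like this (a sketch with illustrative values; the field names and types come from the table above):

```python
import json

# Illustrative audit entry containing all six required fields.
entry = {
    "timestamp": "2026-04-04T14:30:00Z",
    "action": "service_register",
    "status": "success",
    "message": "Registered service hub",
    "user": "deploy",
    "details": {"name": "hub", "host": "127.0.0.1", "port": "8090",
                "protocol": "https", "health_path": "/healthz", "tls": True},
}

# One compact JSON object per line (JSONL): no embedded newlines.
line = json.dumps(entry, separators=(",", ":"))
print(line)
```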
| Action | Script | Description |
|---|---|---|
| `conduit_setup` | `conduit-setup.sh` | Conduit initialized (dnsmasq, Caddy, internal CA) |
| `service_register` | `conduit-register.sh` | Service registered with DNS, TLS, and routing |
| `service_deregister` | `conduit-deregister.sh` | Service deregistered and archived |
| `cert_rotate` | `conduit-certs.sh` | Certificate revoked and reissued for a service |
| `cert_revoke` | `conduit-deregister.sh` | Certificate archived during deregistration |
| `dns_flush` | `conduit-dns.sh` | DNS cache flushed |
| `dns_add` | `conduit-register.sh` | DNS entry added for a service |
| `dns_remove` | `conduit-deregister.sh` | DNS entry removed for a service |
| `health_change` | `conduit-status.sh` | Service health status transition (up/down) |
| `monitor_alert` | `conduit-monitor.sh` | Hardware threshold exceeded (GPU temp, disk, memory) |
| `*_error` | Any (ERR trap) | Error trap fired during execution |
Each action writes specific fields in the `details` object:

- `conduit_setup`: `{"domain": "qp.local", "dns": "dnsmasq", "proxy": "caddy", "ca": "ed25519"}`
- `service_register`: `{"name": "hub", "host": "127.0.0.1", "port": "8090", "protocol": "https", "health_path": "/healthz", "tls": true}`
- `service_deregister`: `{"name": "hub"}`
- `cert_rotate`: `{"name": "hub"}`
- `cert_revoke`: `{"name": "hub"}`
- `dns_flush`: `{}`
- `dns_add`: `{"name": "hub", "host": "127.0.0.1"}`
- `dns_remove`: `{"name": "hub"}`
- `health_change`: `{"name": "hub", "previous": "healthy", "current": "down", "checked_at": "2026-04-04T14:30:00Z"}`
- `monitor_alert`: `{"metric": "gpu_temperature", "value": 92, "threshold": 85, "unit": "celsius", "gpu_index": 0}`
- `*_error` (error trap): `{"line": "42", "script": "conduit-register"}`

A conformant audit entry must satisfy all of the following:
- The entry is valid JSON (parseable by `jq`)
- All six required fields are present
- `timestamp` matches the pattern `^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z$`
- `action` is a non-empty string
- `status` is `"success"` or `"failure"`
- `message` is a string (may be empty)
- `user` is a non-empty string
- `details` is a JSON object (not null, not an array, not a scalar)
- No secrets or tokens appear unmasked in the `message` field
- The entry occupies exactly one line (no embedded newlines in the serialized JSON)
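The conformance rules above can be sketched as a validator. This is a minimal illustration, not QP Conduit's shipped checker: the function name is ours, and the secret-masking rule is omitted because it needs project-specific patterns.

```python
import json
import re

TS_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z$")
REQUIRED = ("timestamp", "action", "status", "message", "user", "details")

def is_conformant(line: str) -> bool:
    """Check one serialized audit log line against the rules above."""
    if "\n" in line.rstrip("\n"):          # must occupy exactly one line
        return False
    try:
        entry = json.loads(line)           # must be valid JSON
    except json.JSONDecodeError:
        return False
    if not isinstance(entry, dict) or any(k not in entry for k in REQUIRED):
        return False                       # all six fields present
    if not TS_PATTERN.match(str(entry["timestamp"])):
        return False
    if not (isinstance(entry["action"], str) and entry["action"]):
        return False
    if entry["status"] not in ("success", "failure"):
        return False
    if not isinstance(entry["message"], str):
        return False                       # message may be empty
    if not (isinstance(entry["user"], str) and entry["user"]):
        return False
    # details must be an object, not null, not an array, not a scalar
    return isinstance(entry["details"], dict)
```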
When qp-capsule is installed, each audit entry is also sealed as a tamper-evident Capsule in `capsules.db`. The sealed entry includes:
- SHA3-256 hash of the JSON content
- Ed25519 signature
- Chain linkage to the previous Capsule
The JSONL audit log is the fast local index. The Capsule database is the cryptographic source of truth. Both contain the same entries.
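Chain linkage is what makes the Capsule database tamper-evident: each entry's hash covers the previous entry's hash, so editing any line changes every hash after it. The sketch below illustrates the idea only; it is not qp-capsule's actual record format, and the Ed25519 signing step is omitted for brevity.

```python
import hashlib

def seal(lines):
    """Chain each entry's SHA3-256 hash to the previous one.

    Modifying any earlier entry changes every subsequent hash,
    which is what makes the chain tamper-evident.
    """
    prev = "0" * 64                        # genesis link
    chain = []
    for line in lines:
        digest = hashlib.sha3_256((prev + line).encode()).hexdigest()
        chain.append({"entry": line, "prev": prev, "hash": digest})
        prev = digest
    return chain
```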
Verify integrity:

```
qp-capsule verify --db ~/.config/qp-conduit/capsules.db
```

The `audit-fixtures.json` file contains golden test vectors. Each fixture has:
| Field | Type | Description |
|---|---|---|
| `description` | string | What this fixture tests |
| `entry` | object | The exact JSON audit log entry |
| `valid` | boolean | Whether a conformant validator should accept this entry |
For every fixture where `valid` is `true`:

- Serialize `entry` as compact JSON (no whitespace)
- Parse it back
- Confirm all six required fields are present and correctly typed
- Confirm `timestamp`, `status`, and `details` pass validation rules
For every fixture where `valid` is `false`:

- Attempt to validate `entry`
- Confirm the validator rejects it
- The `description` field explains what is wrong
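A harness covering both cases can be sketched as follows. The two inline fixtures are fictional examples in the `audit-fixtures.json` shape, and `is_valid` is a deliberately simplified stand-in for a full conformance validator:

```python
import json
import re

TS = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}Z$")
REQUIRED = ("timestamp", "action", "status", "message", "user", "details")

def is_valid(entry):
    """Simplified conformance check (stand-in for a full validator)."""
    return (isinstance(entry, dict)
            and all(k in entry for k in REQUIRED)
            and bool(TS.match(str(entry.get("timestamp", ""))))
            and entry.get("status") in ("success", "failure")
            and isinstance(entry.get("details"), dict))

# Two inline fixtures in the audit-fixtures.json shape.
fixtures = [
    {"description": "minimal dns_flush entry",
     "entry": {"timestamp": "2026-04-04T14:30:00Z", "action": "dns_flush",
               "status": "success", "message": "", "user": "root",
               "details": {}},
     "valid": True},
    {"description": "details is an array, not an object",
     "entry": {"timestamp": "2026-04-04T14:30:00Z", "action": "dns_flush",
               "status": "success", "message": "", "user": "root",
               "details": []},
     "valid": False},
]

for f in fixtures:
    # valid=true: compact round-trip then validate; valid=false: must reject
    line = json.dumps(f["entry"], separators=(",", ":"))
    assert is_valid(json.loads(line)) == f["valid"], f["description"]
```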
New fixtures must:

- Test a specific edge case or action type
- Be deterministic (no random data)
- Include `description`, `entry`, and `valid`
- Use realistic but fictional data