Deterministic, local-first DNS resolver benchmarking with FastAPI + React.
Most "DNS speed test" pages run from a remote browser context or cloud vantage point, so they measure someone else's network path. DNSpect runs on your machine, sends real DNS queries to candidate resolvers, and ranks outcomes using latency, failure rate, and stability metrics.
- Local DNS benchmark API (FastAPI) and web UI (React + Vite + TypeScript).
- Resolver benchmarking with per-run query samples and progress reporting.
- Engine selection: `drill` on Linux when available, with a `dnspython` fallback (including on Windows).
- Ranking metrics per resolver: `avg`, `median`, `p95`, `min`, `max`, timeout/failure rates, consistency ratio.
- Deterministic ranking output and recommended resolver selection.
- CSV and JSON export endpoints.
- System DNS detection on Linux/macOS/Windows.
- Guided "apply DNS" modal with platform-aware instructions and verification probe.
- Live ranking panel during benchmark execution with motion budget controls.
- Optional sample inclusion (`include_samples=1`) for deep diagnostics.
- Last-run persistence in browser storage with schema/version invalidation.
- Runtime queue controls for concurrency and queued jobs via environment variables.
- Queue pressure protection: benchmark start is rejected when `running + queued` exceeds configured capacity.
- Terminal state retention is bounded by TTL and max retained states.
- Persistence failure is non-fatal: benchmark completes and exposes `run_storage_warning`.
- Progress timestamps are monotonic (`last_sample_at` does not move backwards).
- Ranking and recommendation determinism are test-gated against input-order variance.
- Background execution via `ThreadPoolExecutor`.
- Query schedule precomputed per run to avoid repeated scheduling overhead.
- Sample-heavy payloads excluded by default (`samples: []` plus `sample_count`) unless explicitly requested.
- Frontend chart rendering supports Top-N limiting for readability/perf.
- Live ranking animation is automatically reduced/disabled for large row counts or reduced-motion users.
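A minimal sketch of the execution model described above, assuming a caller-supplied `query_fn` (the real runner's internals differ):

```python
from concurrent.futures import ThreadPoolExecutor
from itertools import product


def precompute_schedule(resolvers, queries, runs):
    # Build the full (resolver, query, run_index) schedule once,
    # instead of re-deriving it inside the hot query loop.
    return [(r, q, i) for r, q in product(resolvers, queries) for i in range(runs)]


def run_benchmark(resolvers, queries, runs, query_fn, max_workers=8):
    schedule = precompute_schedule(resolvers, queries, runs)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(lambda job: query_fn(*job), schedule))


# Example with a stub query function standing in for a real DNS lookup:
results = run_benchmark(["1.1.1.1"], ["example.com"], 3,
                        query_fn=lambda r, q, i: (r, q, i))
print(len(results))  # 3
```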
- Release binaries generated for:
- Linux x64
- Windows x64
- macOS x64
- macOS arm64
- Development scripts for Linux/macOS (`scripts/dev.sh`) and Windows (`scripts/dev.ps1`).
- DNS detection methods:
  - Linux: `resolvectl`, then `/etc/resolv.conf`
  - macOS: `scutil --dns`, fallback `networksetup`
  - Windows: `ipconfig /all`, fallback `netsh`
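On Linux, that fallback chain might look like the following sketch (parsing is simplified, `detect_linux_dns` is a hypothetical name, and real `resolvectl` output handling is more involved):

```python
import re
import subprocess
from pathlib import Path


def parse_resolv_conf(text: str) -> list[str]:
    """Collect nameserver entries from resolv.conf-style text."""
    servers = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) >= 2 and parts[0] == "nameserver":
            servers.append(parts[1])
    return servers


def detect_linux_dns() -> list[str]:
    """Try resolvectl first, then fall back to /etc/resolv.conf."""
    try:
        out = subprocess.run(
            ["resolvectl", "dns"],  # argument vector, never shell=True
            capture_output=True, text=True, timeout=5, check=True,
        ).stdout
        ips = re.findall(r"\d{1,3}(?:\.\d{1,3}){3}", out)
        if ips:
            return ips
    except (OSError, subprocess.SubprocessError):
        pass  # resolvectl missing or failed; fall through
    try:
        return parse_resolv_conf(Path("/etc/resolv.conf").read_text())
    except OSError:
        return []
```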
- Local-first execution: no telemetry or analytics pipeline in this repo.
- Network egress is DNS query traffic to selected resolvers only.
- Benchmark metadata is persisted under a platform user data path resolved by `platformdirs` (`user_data_path("dnspect", "DNSpect") / "runs"`); sample persistence is disabled by default unless `DNS_SPEED_LAB_PERSIST_SAMPLES=1`.
- UI stores "last run" in browser local storage for convenience.
- Inbound surface: local HTTP server (`uvicorn`) on `127.0.0.1:8000` by default.
- Browser/API surface: endpoints under `/api/*` for benchmark/probe/export.
- Browser-origin policy surface: CORS is restricted to localhost/127.0.0.1 origins (including localhost regex ports).
- Outbound surface: DNS queries to configured resolver IPs; local OS command execution for DNS detection (`resolvectl`, `scutil`, `networksetup`, `ipconfig`, `netsh`) and optional `drill`.
- Resolver input must be literal IP addresses (IPv4/IPv6); invalid values fail validation.
- Domain input must match strict hostname regex and is normalized/lowercased.
- Hard limits exist for workload control:
  - Benchmark: `runs` 1..300, `timeout_sec` 0.1..10, max 256 resolvers/queries.
  - Probe: max 8 resolvers, 32 queries, `runs_per_resolver` 1..5, `timeout_sec` 0.1..5.
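The backend enforces these with `pydantic` request models; a dependency-free sketch of equivalent checks (the function and the exact hostname regex are illustrative):

```python
import ipaddress
import re

# Simplified hostname pattern: dot-separated labels, 253 chars max.
HOSTNAME_RE = re.compile(
    r"^(?=.{1,253}$)[a-z0-9]([a-z0-9-]{0,61}[a-z0-9])?"
    r"(\.[a-z0-9]([a-z0-9-]{0,61}[a-z0-9])?)*$"
)


def validate_benchmark_request(resolvers, queries, runs, timeout_sec):
    """Enforce the documented bounds: runs 1..300, timeout 0.1..10, max 256 each."""
    if not 1 <= runs <= 300:
        raise ValueError("runs must be in 1..300")
    if not 0.1 <= timeout_sec <= 10:
        raise ValueError("timeout_sec must be in 0.1..10")
    if len(resolvers) > 256 or len(queries) > 256:
        raise ValueError("at most 256 resolvers/queries")
    for ip in resolvers:
        ipaddress.ip_address(ip)  # raises ValueError on non-literal input
    normalized = []
    for q in queries:
        q = q.strip().lower()  # normalize/lowercase as the API does
        if not HOSTNAME_RE.match(q):
            raise ValueError(f"invalid hostname: {q}")
        normalized.append(q)
    return normalized
```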
- Traditional HTTP SSRF is not applicable by design because the backend does not fetch arbitrary URLs.
- Resolver targets are constrained to validated IP literals; domain names are query payloads, not network destinations.
- DNS queries can still reach private/internal IP resolvers if the local operator provides them. This is expected behavior for local diagnostics.
- Stable scoring and ranking keys are applied before response/export.
- Tie-breaking includes resolver identity for stable order.
- Tests explicitly verify ranking/recommendation invariance when resolver input order changes.
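That invariance follows from a composite sort key; a minimal sketch, where the single `score` field stands in for the real normalized metrics:

```python
def rank(results: list[dict]) -> list[dict]:
    # Sort by score, then by resolver identity, so equal scores always
    # come out in the same order regardless of input order.
    return sorted(results, key=lambda r: (r["score"], r["resolver"]))


a = [{"resolver": "8.8.8.8", "score": 1.0}, {"resolver": "1.1.1.1", "score": 1.0}]
b = list(reversed(a))
assert rank(a) == rank(b)  # input order does not change the ranking
print(rank(a)[0]["resolver"])  # "1.1.1.1" wins the tie by identity
```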
- Validation layer: `pydantic` request models plus normalization/deduplication.
- Throttling layer: bounded worker pool plus queue limits (`DNS_SPEED_LAB_MAX_CONCURRENT_JOBS`, `DNS_SPEED_LAB_MAX_QUEUED_JOBS`).
- No OS sandboxing/isolation layer is implemented; process runs with current user privileges.
- Subprocess calls avoid `shell=True`, reducing the command injection surface.
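The throttling check reduces to a capacity comparison; a sketch (the defaults shown here are illustrative, not the repo's):

```python
import os


def capacity() -> int:
    # Defaults are illustrative; the real defaults live in the backend config.
    max_running = int(os.environ.get("DNS_SPEED_LAB_MAX_CONCURRENT_JOBS", "2"))
    max_queued = int(os.environ.get("DNS_SPEED_LAB_MAX_QUEUED_JOBS", "8"))
    return max_running + max_queued


def try_start(running: int, queued: int, cap: int) -> bool:
    """Queue pressure check: reject when running + queued has reached capacity."""
    return running + queued < cap


print(try_start(1, 2, cap=4))  # True: room left
print(try_start(2, 2, cap=4))  # False: at capacity, start is rejected
```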
- Boundary 1: Browser/UI input is untrusted until backend validation.
- Boundary 2: Backend process is trusted code, but depends on local OS/network state and resolver responses (untrusted external systems).
- Boundary 3: If host binding is changed from localhost to a network-accessible interface, threat model changes materially (auth/TLS are not built in).
```mermaid
flowchart TD
A[User Input from React UI] --> B[FastAPI Endpoint]
B --> C[Pydantic Validation and Normalization]
C --> D[BenchmarkManager Queue and State]
D --> E{Engine Selection}
E -->|Linux + drill available| F[drill Query Runner]
E -->|Windows or drill unavailable on macOS or Linux| G[dnspython Query Runner]
F --> H[Stats and Failure Classification]
G --> H
H --> I[Normalized Scoring and Ranking]
I --> J[Recommended Resolver Selection]
J --> K[API Response, JSON Export, CSV Export]
I --> L[(Optional Run Persistence)]
```
Detailed design notes: docs/ARCHITECTURE.md.
Prerequisites:

- Python `>=3.11`
- Node.js `22.14.0` (CI pin)
Linux/macOS:

```bash
bash scripts/dev.sh
```

Windows (PowerShell):

```powershell
.\scripts\dev.ps1
```

To run a release build instead:

- Download the platform asset from GitHub Releases.
- Run the binary.
- Open `http://127.0.0.1:8000`.
Release verification guide: docs/RELEASE_VERIFY.md.
- Open the app.
- Choose resolvers and mode (`quick`, `standard`, `exhaustive`).
- Start the benchmark and watch the live ranking.
- Export CSV/JSON when done.
```bash
curl -sS -X POST http://127.0.0.1:8000/api/benchmarks \
  -H 'Content-Type: application/json' \
  -d '{
    "mode": "standard",
    "runs": 30,
    "timeout_sec": 2,
    "resolvers": ["1.1.1.1", "8.8.8.8"],
    "queries": ["cloudflare.com", "google.com"]
  }'
```

Poll status:

```bash
curl -sS http://127.0.0.1:8000/api/benchmarks/<benchmark_id>
```

Export:

```bash
curl -sS -o dnspect.csv http://127.0.0.1:8000/api/benchmarks/<benchmark_id>/export.csv
curl -sS -o dnspect.json "http://127.0.0.1:8000/api/benchmarks/<benchmark_id>/export.json?include_samples=1"
```

```bash
make backend-check
make backend-semgrep
cd frontend && npm run lint && npm run typecheck && npm test && npm run build
```

| Dimension | Typical online checker | DNSpect |
|---|---|---|
| Measurement vantage point | Remote browser/cloud path | Your machine and network path |
| DNS execution model | Often indirect/proxy-based checks | Real DNS lookups via drill/dnspython |
| Determinism | May vary with server-side cohort/load | Deterministic ranking/scoring path with stable tie-breaks |
| Input controls | Limited resolver/query control | Explicit resolvers, query list, run count, timeout, mode |
| Privacy | Traffic and metadata often leave local environment | Local-first operation; no telemetry codepath |
| Reliability signaling | Usually simple latency-only ranking | Latency + failure + stability + reliability guardrail |
| Validation guarantees | Usually undocumented | Enforced request validation and hard workload bounds |
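The curl flow above can also be scripted; a minimal polling-client sketch using only the standard library (the `id` and `status` field names and the terminal states are assumptions about the response shape):

```python
import json
import time
import urllib.request

BASE = "http://127.0.0.1:8000"  # default local bind


def build_payload(resolvers, queries, mode="standard", runs=30, timeout_sec=2) -> bytes:
    """Serialize a benchmark request matching the documented schema."""
    return json.dumps({
        "mode": mode, "runs": runs, "timeout_sec": timeout_sec,
        "resolvers": resolvers, "queries": queries,
    }).encode()


def run_and_wait(resolvers, queries, interval: float = 1.0) -> dict:
    """Start a benchmark, then poll its status endpoint until a terminal state."""
    req = urllib.request.Request(
        f"{BASE}/api/benchmarks",
        data=build_payload(resolvers, queries),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        benchmark_id = json.load(resp)["id"]  # field name assumed
    while True:
        with urllib.request.urlopen(f"{BASE}/api/benchmarks/{benchmark_id}") as resp:
            state = json.load(resp)
        if state.get("status") in ("completed", "failed"):  # terminal states assumed
            return state
        time.sleep(interval)
```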
- Scheduled/recurring benchmarks with persistent history
- Configurable alerting when a resolver degrades beyond threshold
- CLI-only mode (headless, no UI dependency)
- Start here: `CONTRIBUTING.md`
- Security reporting: `SECURITY.md`
- Provider dataset notes: `docs/PROVIDERS.md`
MIT. See LICENSE.
