Principle: The phone is a dumb scanner: it captures QR frames and exports them as JSON for USB/ADB retrieval. Zero crypto runs on the device.
The mobile app's primary (and currently only active) workflow is fully offline JSON export. No network, no bridge server, no WebSocket.
```
┌──────────────┐      USB / ADB pull           ┌──────────────────┐
│  Mobile App  │ ────────────────────────────► │   Desktop CLI    │
│   (React     │  meow-capture-*.json          │     (Python)     │
│   Native)    │  (Downloads)                  │                  │
│              │                               │   decode_gif()   │
└──────┬───────┘                               └────────┬─────────┘
       │                                                │
   Camera +                                      Fountain decode
   QR scan                                       + AES-256-GCM
  (on device)                                    (on workstation)
```
- Phone opens camera → detects the animated GIF displayed on the air-gapped screen.
- Each frame is decoded to raw QR bytes on-device (using `react-native-vision-camera`).
- Frames are collected in memory by `useCapture` (indices + base64 payloads).
- User taps Export → biometric gate (if available) → JSON written to Downloads.
- User retrieves the file via USB/ADB (`adb pull`) or iOS Files/AirDrop.
- Desktop CLI reassembles the fountain-coded stream → decrypt → verify.
If USB is unavailable, ExportScreen can display the captured JSON as a series of QR codes for the desktop to scan back (reverse optical transfer).
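On the desktop side, retrieving the capture reduces to reading the JSON, ordering frames, and decoding the base64 payloads. A minimal sketch, assuming an illustrative schema with `frames`, `index`, and `payload_b64` fields (the actual export format may differ):

```python
import base64
import json

def load_capture(path: str) -> list[bytes]:
    """Load a meow-capture-*.json export and return raw QR payloads
    ordered by frame index, ready for fountain decoding.

    NOTE: the field names ("frames", "index", "payload_b64") are
    illustrative assumptions, not the confirmed export schema.
    """
    with open(path) as f:
        capture = json.load(f)
    # Sort by the phone-side frame index so the stream is reassembled
    # in capture order, then decode each base64 payload to raw bytes.
    frames = sorted(capture["frames"], key=lambda fr: fr["index"])
    return [base64.b64decode(fr["payload_b64"]) for fr in frames]
```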
| Boundary | Trust | Notes |
|---|---|---|
| Phone ↔ air-gapped screen | Optical | Camera captures QR codes off screen |
| Phone → Desktop (USB) | Physical transfer | JSON file, no live connection |
| Phone storage | Untrusted | Phone never sees plaintext, keys, or passwords |
| Clipboard | Time-limited | ADB commands auto-wiped after 45 s |
Key invariant: The phone never receives the password, derived key, or plaintext. If the phone is compromised, the attacker gains only the same ciphertext visible on the air-gapped screen.
Status: Not implemented in the mobile app. The bridge server and protocol are designed but not wired; see `bridge/` for reference implementations. The local JSON export flow above is the primary and only active mode, and all release UI entry points use the JSON export path. The bridge mode is documented here for future development reference only.
For real-time streaming without USB, a future mode will support a local WebSocket bridge between the phone and a workstation on the same LAN or USB.
```
┌──────────────┐       JSON / WebSocket            ┌──────────────────┐
│  Mobile App  │ ────────────────────────────────► │  meow-bridge CLI │
│   (React     │  { seq, qr_bytes_b64, … }         │     (Python)     │
│   Native)    │ ◄──────────────────────────────── │                  │
│              │  { status, progress, … }          │   decode_gif()   │
└──────┬───────┘                                   └────────┬─────────┘
       │                                                    │
   Camera +                                          Fountain decode
   QR scan                                           + AES-256-GCM
  (on device)                                        (on workstation)
```
All messages are UTF-8 JSON, one message per WebSocket frame.
Sent once when the user begins scanning.
```json
{
  "type": "scan_start",
  "device_id": "iPhone-15-Pro",
  "timestamp_ms": 1706745600000
}
```

Sent for each QR code frame captured by the camera.
```json
{
  "type": "frame",
  "seq": 0,
  "qr_bytes_b64": "<base64-encoded raw QR payload>",
  "timestamp_ms": 1706745600100
}
```

| Field | Type | Description |
|---|---|---|
| `seq` | int | Monotonically increasing sequence number (phone-side counter) |
| `qr_bytes_b64` | string | Base64-encoded raw bytes from the QR code |
| `timestamp_ms` | int | Unix epoch milliseconds when the frame was captured |
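To make the field contract concrete, here is a minimal sketch of building and validating a `frame` message in Python (`make_frame` and `validate_frame` are illustrative helpers, not part of the codebase):

```python
import base64
import time

def make_frame(seq: int, qr_bytes: bytes) -> dict:
    """Build a `frame` message per the table above: monotonically
    increasing seq, base64-encoded raw QR bytes, epoch-ms timestamp."""
    return {
        "type": "frame",
        "seq": seq,
        "qr_bytes_b64": base64.b64encode(qr_bytes).decode("ascii"),
        "timestamp_ms": int(time.time() * 1000),
    }

def validate_frame(msg: dict, last_seq: int) -> bool:
    """Bridge-side check: correct type, strictly increasing seq,
    and a payload that actually decodes as base64."""
    if msg.get("type") != "frame" or msg.get("seq", -1) <= last_seq:
        return False
    try:
        base64.b64decode(msg["qr_bytes_b64"], validate=True)
        return True
    except (KeyError, ValueError):
        return False
```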
Sent when the user stops scanning (or all frames received).
```json
{
  "type": "scan_end",
  "total_frames_sent": 42,
  "timestamp_ms": 1706745602000
}
```

Sent after each frame is received and validated.
```json
{
  "type": "ack",
  "seq": 0,
  "accepted": true,
  "reason": ""
}
```

Sent periodically during decoding to update the phone UI.
```json
{
  "type": "progress",
  "frames_received": 35,
  "frames_needed": 42,
  "blocks_decoded": 20,
  "blocks_total": 28,
  "percent": 71.4
}
```

Sent when decoding completes or fails.
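Note that in the example `percent` tracks fountain-decode completion (20/28 blocks ≈ 71.4%) rather than raw frame count, since fountain-coded frames can be redundant. A hypothetical sketch of the derivation:

```python
def progress_message(frames_received: int, frames_needed: int,
                     blocks_decoded: int, blocks_total: int) -> dict:
    """Build a `progress` message. The percent field reflects fountain
    block completion, not frames, since duplicate or redundant frames
    do not advance the decode. (Helper name is illustrative.)"""
    percent = round(100.0 * blocks_decoded / blocks_total, 1) if blocks_total else 0.0
    return {
        "type": "progress",
        "frames_received": frames_received,
        "frames_needed": frames_needed,
        "blocks_decoded": blocks_decoded,
        "blocks_total": blocks_total,
        "percent": percent,
    }
```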
```json
{
  "type": "result",
  "success": true,
  "output_file": "secret.pdf",
  "output_size": 102400,
  "elapsed_s": 3.2,
  "error": null
}
```

Sent on fatal errors.
```json
{
  "type": "error",
  "code": "HMAC_FAIL",
  "message": "Wrong password or corrupted data"
}
```

| Code | Meaning |
|---|---|
| `HMAC_FAIL` | HMAC verification failed (wrong password) |
| `DECODE_INCOMPLETE` | Not enough frames received |
| `QR_CORRUPT` | QR payload could not be parsed |
| `MANIFEST_INVALID` | First frame (manifest) failed validation |
| `INTERNAL` | Unexpected server error |
```bash
# Start bridge server (waits for phone connection)
meow-bridge --output secret.pdf --password "hunter2" --port 9999

# With verbose + tamper report
meow-bridge --output secret.pdf -p "hunter2" --verbose --tamper-report
```

The bridge server is intentionally minimal:
- Opens a WebSocket server on `localhost:<port>`.
- Accepts `frame` messages and collects raw QR bytes in memory.
- On `scan_end` (or once sufficient frames arrive), assembles the byte stream as if reading from a GIF and passes it to the `decode_gif()` internals (manifest parsing → fountain decode → decrypt).
- Streams `progress` and `result` back to the phone.
The server does not persist frames to disk (to minimize attack surface).
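The per-connection logic above can be sketched transport-agnostically. `BridgeSession` is a hypothetical name, the real `decode_gif()` pipeline is stubbed out as a callback, and no actual WebSocket is involved:

```python
class BridgeSession:
    """Minimal in-memory session sketch: collect frame payloads, ack
    each one, and hand the ordered byte chunks to a decoder callback on
    scan_end. Frames are never written to disk, matching the design
    constraint stated in the document."""

    MAX_QR_PAYLOAD = 4096  # 4 KiB per-QR-payload cap from the hardening list

    def __init__(self, decoder):
        self.decoder = decoder           # callable(list[bytes]) -> result dict
        self.frames: dict[int, bytes] = {}

    def on_frame(self, seq: int, payload: bytes) -> dict:
        if len(payload) > self.MAX_QR_PAYLOAD:
            return {"type": "ack", "seq": seq, "accepted": False,
                    "reason": "payload too large"}
        if seq in self.frames:           # duplicate frame: reject quietly
            return {"type": "ack", "seq": seq, "accepted": False,
                    "reason": "duplicate"}
        self.frames[seq] = payload
        return {"type": "ack", "seq": seq, "accepted": True, "reason": ""}

    def on_scan_end(self) -> dict:
        # Reassemble in sequence order, as if reading frames from a GIF,
        # then delegate to the decode pipeline.
        chunks = [self.frames[s] for s in sorted(self.frames)]
        return self.decoder(chunks)
```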
```json
{
  "react-native-vision-camera": "^4.0.0",
  "react-native-worklets-core": "^1.0.0"
}
```

```tsx
<MeowScanner
  bridgeUrl="ws://192.168.1.42:9999"
  onProgress={(p) => setProgress(p.percent)}
  onResult={(r) => Alert.alert(r.success ? "Done!" : "Failed", r.message)}
  onError={(e) => Alert.alert("Error", e.message)}
/>
```

- Request camera permission
- Open camera with QR code detection enabled
- On each QR detection, base64-encode the raw bytes and send as `frame`
- De-duplicate frames (skip if same bytes as previous frame)
- Display progress from `progress` messages
- Show result when `result` message arrives
- No crypto: no key derivation, no decryption
- No password handling: the password is entered on the CLI side
- No file storage: raw bytes are forwarded, not saved
- No internet access: only a local WebSocket to the CLI
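The de-duplication rule above (skip a frame whose bytes match the previous frame) matters because the camera sees each animated-GIF frame many times before the animation advances. The logic, sketched in Python for brevity (the app implements it phone-side in TypeScript):

```python
def dedupe_stream(raw_frames):
    """Yield only frames whose bytes differ from the immediately
    preceding frame, mirroring the scanner's skip rule. Repeated
    sightings of the same GIF frame collapse to one emission."""
    prev = None
    for frame in raw_frames:
        if frame != prev:
            yield frame
            prev = frame
```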
Phone and workstation are on the same LAN; the phone connects to `ws://<workstation-ip>:9999`.

Pros: wireless, easy setup. Cons: requires the same network; mDNS/Bonjour can simplify discovery.
Use ADB (Android) or a Lightning relay (iOS) to forward a local port:

```bash
# Android: expose the workstation's port 9999 as localhost:9999 on the phone
adb reverse tcp:9999 tcp:9999

# iOS (via libimobiledevice)
iproxy 9999 9999
```

The phone connects to `ws://localhost:9999`, which is forwarded to the workstation.

Pros: no network exposure; better for high-security use. Cons: requires cable + tooling.
```
AppNavigator (native-stack)
├── SplashScreen
├── OnboardingScreen
├── HomeScreen
│   ├── CalibrationWizard      → 5-step preflight (permissions, QR test, light, brightness, thermal)
│   ├── DiagnosticsPanel       → hidden long-press panel (JS lag, heap, FPS, thermal heuristic, safe-to-share export)
│   ├── RequestQR modal        → Camera + useCodeScanner to scan request from sender screen
│   └── [video import button]  → useVideoImport (feature-flagged OFF, hidden from release UI)
├── CaptureScreen
│   ├── CameraPreview          → pinch zoom, exposure nudge, shake detection
│   ├── CatWhiskerHUD          → animated progress ring (Reanimated 3)
│   ├── ProgressHUD            → confidence label, safeToStop pill, decode-rate row
│   └── CaptureCoachPanel      → live coaching hints (shake / light / decode rate)
├── ExportScreen               → biometric gate, SHA-256 verify, ADB + filename copy
└── SettingsScreen             → Strict / Convenience security mode toggle (MMKV-backed)
```
| Hook | Purpose |
|---|---|
| `useQRScanner` | ML-Kit/AVFoundation code scanner + decode-rate / duplicate-rate ring buffers |
| `useSessionManager` | Fountain-decode state machine; exposes `decodeRate`, `duplicateRate` |
| `useCapture` | `useReducer`-based capture state + MMKV checkpoint (indices only) |
| `useSecurityMode` | MMKV-backed strict/convenience toggle |
| `useVideoImport` | TurboModule bridge stub for local video frame extraction (feature-flagged OFF in release) |
| Component | File | Description |
|---|---|---|
| `CaptureCoachPanel` | `components/CaptureCoachPanel.tsx` | Priority-ranked live hints derived from decode rate, duplicate rate, shake, exposure |
| `CalibrationWizard` | `components/CalibrationWizard.tsx` | Modal preflight checklist with live QR scan test |
| `DiagnosticsPanel` | `components/DiagnosticsPanel.tsx` | Hidden long-press overlay: JS lag via rAF, heap, thermal, FPS |
| `SettingsScreen` | `screens/SettingsScreen.tsx` | Strict vs Convenience security mode with full implications table |
```
mobile/
├── ARCHITECTURE.md        ← this file
├── README.md              ← platform stubs overview
├── bridge/
│   ├── protocol.py        ← wire protocol message classes
│   └── server.py          ← WebSocket bridge server (future)
├── react-native/
│   ├── MeowScanner.tsx    ← scanner component (reference impl)
│   └── useBridge.ts       ← WebSocket hook
├── android/
│   └── MeowCrypto.kt      ← native crypto stubs
└── ios/
    └── MeowCrypto.swift   ← native crypto stubs
```
- Phone never receives the password or derived key
- WebSocket binds to `localhost` by default (no remote access)
- USB transport preferred for high-security scenarios
- No QR frame data persisted on the phone
- Bridge server validates frame sizes (max 4 KiB per QR payload)
- Rate limiting: max 100 frames/second to prevent DoS
- TLS optional for LAN (`wss://`) but not required for localhost
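The 100 frames/second cap could be enforced with a simple token bucket on the bridge's receive path. A minimal sketch under assumed names, not the shipped implementation (the injectable `clock` exists only to make the behavior testable):

```python
import time

class FrameRateLimiter:
    """Token bucket: refills at `rate` tokens/second up to a burst of
    `rate` tokens. allow() spends one token per incoming frame message
    and returns False when the sender exceeds the cap."""

    def __init__(self, rate: float = 100.0, clock=time.monotonic):
        self.rate = rate
        self.clock = clock
        self.tokens = rate        # start with a full bucket
        self.last = clock()

    def allow(self) -> bool:
        now = self.clock()
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.rate, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False
```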
- mDNS/Bonjour discovery: phone auto-discovers the CLI on the LAN
- Bidirectional mode: encode on the workstation, phone displays QR codes for another air-gapped device
- Multi-device fan-out: multiple phones scan different portions (clowder mode)
- Flutter / native alternatives: port the scanner to non-RN frameworks
The app currently uses the Old Architecture (Paper renderer + Bridge modules).
Migration to the New Architecture (newArchEnabled=true) is tracked below.
Android (`android/gradle.properties`):

```properties
newArchEnabled=true
```

iOS (`ios/Podfile` post-install):

```ruby
:fabric_enabled => true
```

| Dependency | New Arch Ready | Notes |
|---|---|---|
| `react-native-vision-camera` v4 | ✅ | Native Fabric renderer since v3 |
| `react-native-reanimated` v3 | ✅ | Full TurboModule + Fabric support |
| `react-native-gesture-handler` v2 | ✅ | Fabric-aware since 2.12 |
| `react-native-mmkv` | ✅ | Pure JSI, no Bridge dependency |
| `react-native-biometrics` | ⚠️ | Works but not yet a TurboModule; Bridge interop layer handles it |
| `react-native-haptic-feedback` | ⚠️ | Bridge interop; consider `expo-haptics` TurboModule alternative |
| `react-native-sensors` | ❌ | Not New Arch compatible; needs replacement with `expo-sensors` or a custom TurboModule |
| `react-native-sound` | ⚠️ | Bridge interop; consider `expo-av` TurboModule alternative |
| `react-native-svg` | ✅ | Fabric support since 15.0 |
| `react-native-document-picker` | ⚠️ | Bridge interop; works under compatibility layer |
| `react-native-fs` | ⚠️ | Bridge interop; consider `expo-file-system` for native TurboModule |
| `react-native-safe-area-context` | ✅ | Full Fabric support |
- Enable `newArchEnabled=true` in Android `gradle.properties`
- Test all screens: the Paper→Fabric renderer change can cause layout shifts
- Replace `react-native-sensors` (the highest-priority blocker):
  - Option A: `expo-sensors` (TurboModule, drop-in replacement)
  - Option B: custom JSI accelerometer module (minimal surface area)
- Verify Bridge interop: modules marked ⚠️ should work via the interop layer but may have minor timing differences
- Performance benchmark: measure camera pipeline latency before/after:
  - Frame decode rate should stay ≥ 15 fps with a 30 fps camera feed
  - Gesture response (pinch zoom) should be < 16 ms
- Low risk: Vision Camera, Reanimated, Gesture Handler (already Fabric-native)
- Medium risk: MMKV (JSI adapter change), Biometrics, Sound (interop layer)
- High risk: `react-native-sensors` must be replaced before enabling New Arch
- Estimated effort: 2-3 days for a developer familiar with the codebase