# flutter_meta_wearables_dat

A Flutter plugin that provides a bridge to Meta's Wearables Device Access Toolkit (DAT), enabling integration with Meta AI Glasses on iOS and Android.
The Meta Wearables Device Access Toolkit is currently in developer preview. During this phase:
- You can use the SDK to build, prototype, and test your app.
- You can distribute to testers within your organization or team (e.g. via the beta testing platform in the Meta Wearables Developer Center).
- Publishing to the general public is limited: only select partners can publish their DAT integrations to public app stores. Most apps using DAT cannot be published publicly yet.
Meta is running the preview to test, learn, and refine the toolkit; broader publishing (general availability) is planned for 2026. For full details, see Introducing the Meta Wearables Device Access Toolkit and the Meta Wearables FAQ.
Supported glasses:

- Ray-Ban Meta (Gen 1 & 2)
- Meta Ray-Ban Display
- Oakley Meta HSTN
- Oakley Meta Vanguard
To set up your glasses for development, you must enable Developer mode in the Meta AI app. See Enable developer mode in the Meta AI app for instructions.
Minimum deployment target: iOS 17.0
Add the following to your `Info.plist`:

```xml
<key>NSBluetoothAlwaysUsageDescription</key>
<string>Needed to connect to Meta AI Glasses</string>
<key>LSApplicationQueriesSchemes</key>
<array>
  <string>fb-viewapp</string>
</array>
<key>UISupportedExternalAccessoryProtocols</key>
<array>
  <string>com.meta.ar.wearable</string>
</array>
<key>UIBackgroundModes</key>
<array>
  <string>bluetooth-peripheral</string>
  <string>external-accessory</string>
</array>
<!-- Deep link callback from Meta AI app - scheme must match AppLinkURLScheme below -->
<key>CFBundleURLTypes</key>
<array>
  <dict>
    <key>CFBundleURLSchemes</key>
    <array>
      <string>myexampleapp</string>
    </array>
  </dict>
</array>
<!-- Meta Wearables Device Access Toolkit setup -->
<key>MWDAT</key>
<dict>
  <key>AppLinkURLScheme</key>
  <!-- Must match CFBundleURLSchemes above so Meta AI redirects to a URL this app handles -->
  <string>myexampleapp://</string>
  <key>MetaAppID</key>
  <!-- Without Developer Mode, use the ID from the app registered in Wearables Developer Center -->
  <string>YOUR_APP_ID</string>
  <key>ClientToken</key>
  <!-- Without Developer Mode, use the ClientToken from Wearables Developer Center -->
  <string>YOUR_CLIENT_TOKEN</string>
  <key>TeamID</key>
  <!-- Your Apple Developer Team ID - set this in Xcode under Signing & Capabilities -->
  <string>$(DEVELOPMENT_TEAM)</string>
  <key>Analytics</key>
  <dict>
    <key>OptOut</key>
    <true/>
  </dict>
</dict>
```

Add the following to your app's `AndroidManifest.xml`:
```xml
<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.BLUETOOTH_CONNECT" />
<uses-permission android:name="android.permission.INTERNET" />

<application>
  <!-- Required: your application ID from the Wearables Developer Center -->
  <!-- Use "0" for Developer Mode -->
  <meta-data
    android:name="com.meta.wearable.mwdat.APPLICATION_ID"
    android:value="0" />

  <!-- Optional: disable analytics -->
  <meta-data
    android:name="com.meta.wearable.mwdat.ANALYTICS_OPT_OUT"
    android:value="true" />

  <!-- Deep link callback from the Meta AI app -->
  <activity android:name=".MainActivity" android:launchMode="singleTop">
    <intent-filter>
      <action android:name="android.intent.action.VIEW" />
      <category android:name="android.intent.category.BROWSABLE" />
      <category android:name="android.intent.category.DEFAULT" />
      <data android:scheme="myexampleapp" />
    </intent-filter>
  </activity>
</application>
```

Add the GitHub Packages repository to your `settings.gradle.kts`. First, add the necessary imports at the top of the file:
```kotlin
import java.util.Properties
import kotlin.io.path.div
import kotlin.io.path.exists
import kotlin.io.path.inputStream
```

Then add the repository configuration:
```kotlin
val localProperties =
    Properties().apply {
        val localPropertiesPath = rootDir.toPath() / "local.properties"
        if (localPropertiesPath.exists()) {
            load(localPropertiesPath.inputStream())
        }
    }

dependencyResolutionManagement {
    // Flutter's Gradle plugin adds a maven repo at the project level.
    repositoriesMode.set(RepositoriesMode.PREFER_SETTINGS)
    repositories {
        google()
        mavenCentral()
        maven {
            url = uri("https://maven.pkg.github.com/facebook/meta-wearables-dat-android")
            credentials {
                username = "" // not needed
                password = System.getenv("GITHUB_TOKEN") ?: localProperties.getProperty("github_token")
            }
        }
    }
}
```

Note: we use `PREFER_SETTINGS` instead of `FAIL_ON_PROJECT_REPOS` because Flutter's Gradle plugin needs to add repositories at the project level.
Set a GitHub token with the `read:packages` scope via either:

- the `GITHUB_TOKEN` environment variable, or
- `local.properties`: `github_token=your_token_here`
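For example (the token value is a placeholder; in a Flutter project, `local.properties` lives under `android/` and is git-ignored by default):

```shell
# Option 1: environment variable (shell profile or CI secret)
export GITHUB_TOKEN=ghp_your_token_here

# Option 2: android/local.properties
mkdir -p android  # no-op in a real project; the folder already exists
echo "github_token=ghp_your_token_here" >> android/local.properties
```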
Wearables permission requests use `Wearables.RequestPermissionContract`, which requires the hosting Android `Activity` to be a `ComponentActivity`. In a Flutter app this means you must extend `FlutterFragmentActivity` (which itself extends `FragmentActivity` → `ComponentActivity`), not `FlutterActivity`.
Update your `android/app/src/main/kotlin/.../MainActivity.kt` (or `.java`) to:

```kotlin
package com.yourcompany.yourapp

import io.flutter.embedding.android.FlutterFragmentActivity

class MainActivity : FlutterFragmentActivity()
```

If you keep using `FlutterActivity`, the DAT permission sheet will not be able to register an `ActivityResultLauncher`, and camera permission requests will fail.
Add and configure your app in the Meta Wearables Developer Center to obtain your MetaAppID and complete the setup.
The plugin follows Meta's integration lifecycle as documented in the Meta Wearables Developer Documentation:
**1. Runtime permissions (Android)**

- Call `MetaWearablesDat.requestAndroidPermissions()` before any other DAT calls
- This requests the Bluetooth and Internet runtime permissions required by the DAT SDK
- Important: on Android, the DAT SDK is not initialized until these permissions are granted. This is critical for device discovery to work correctly.
- No-op on iOS

**2. Registration**

- User taps a call-to-action in your app (e.g., "Connect my glasses")
- Call `MetaWearablesDat.startRegistration()` to open the Meta AI app
- User confirms the connection in the Meta AI app
- Meta AI returns to your app via deep link
- Handle the callback URL with `MetaWearablesDat.handleUrl(url)` to complete registration
- Monitor registration state via `MetaWearablesDat.registrationStateStream()`
- Monitor active device availability via `MetaWearablesDat.activeDeviceStream()`

**3. Camera permission**

- When your app first attempts to access the camera, request permission
- Call `MetaWearablesDat.requestCameraPermission()` to show the Meta AI permission bottom sheet
- User can allow always, allow once, or deny

**4. Streaming**

- Once registered and permissions are granted, start a streaming session
- Call `MetaWearablesDat.startStreamSession(deviceUUID)` — returns a `textureId`
- Render the live video feed using Flutter's `Texture` widget with the returned ID
- Monitor session state via `MetaWearablesDat.streamSessionStateStream()`
- Monitor errors via `MetaWearablesDat.streamSessionErrorStream()`
- Call `MetaWearablesDat.stopStreamSession(deviceUUID)` to end the session
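The permission and registration steps above can be sketched as follows. This is a simplified sketch: the element types of the streams and the `device != null` check are assumptions, and the deep-link wiring is left to your own handler (see the example app for the full implementation):

```dart
Future<void> connectGlasses() async {
  // 1. Android runtime permissions (no-op on iOS). Must come first:
  //    the DAT SDK is not initialized until they are granted.
  await MetaWearablesDat.requestAndroidPermissions();

  // 2. Watch registration state before kicking off the flow.
  MetaWearablesDat.registrationStateStream().listen((state) {
    print('Registration state: $state');
  });

  // 3. Open the Meta AI app; the user confirms the connection there.
  await MetaWearablesDat.startRegistration();

  // 4. When Meta AI deep-links back, forward the callback URL:
  //    await MetaWearablesDat.handleUrl(callbackUrl);

  // 5. Once registered, watch for the active device and request camera access.
  MetaWearablesDat.activeDeviceStream().listen((device) async {
    if (device != null) {
      await MetaWearablesDat.requestCameraPermission();
    }
  });
}
```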
```dart
// Start streaming — returns a texture ID for zero-copy rendering
final textureId = await MetaWearablesDat.startStreamSession(
  deviceUUID,
  fps: 24,
  streamQuality: StreamQuality.low,
  videoCodec: VideoCodec.raw, // or VideoCodec.hvc1 (iOS only, supports background streaming)
);

// Render the live video feed
Texture(textureId: textureId);

// Monitor session state
MetaWearablesDat.streamSessionStateStream().listen((state) {
  // StreamSessionState: stopped, waitingForDevice, starting, streaming, paused, stopping
  print('Session state: $state');
});

// Monitor errors (e.g., thermalCritical, hingesClosed, permissionDenied)
MetaWearablesDat.streamSessionErrorStream().listen((error) {
  print('Session error: ${error.code} — ${error.message}');
  if (error.isThermalCritical) {
    // Device overheating — streaming paused automatically
  }
});

// Capture a photo during streaming
final photo = await MetaWearablesDat.capturePhoto(
  deviceUUID,
  format: PhotoCaptureFormat.jpeg, // or PhotoCaptureFormat.heic
);

// Stop streaming when done
await MetaWearablesDat.stopStreamSession(deviceUUID);
```

Video frames are pushed directly from native (`CVPixelBuffer` on iOS, `SurfaceTexture` on Android) to the Flutter engine — no JPEG encoding, no byte copying, no Dart-side decoding.
| Codec | Platform | Description |
|---|---|---|
| `VideoCodec.raw` | iOS & Android | Raw uncompressed frames. Foreground only — frame delivery stops when the app is backgrounded. Default. |
| `VideoCodec.hvc1` | iOS only | Compressed HEVC frames. Works in both foreground and background. On iOS, frames are decoded via VideoToolbox's hardware HEVC decoder. Ignored on Android. |

| Quality | Resolution |
|---|---|
| `StreamQuality.low` | 360 x 640 |
| `StreamQuality.medium` | 504 x 896 |
| `StreamQuality.high` | 720 x 1280 |
Valid FPS values: 2, 7, 15, 24, 30.
For use cases that need pixel-level access — OCR, on-device ML inference, computer vision — use `captureStreamFrame`. This rasterizes the Flutter texture on the Dart side and returns raw RGBA bytes:
```dart
import 'dart:async';

// After startStreamSession returns a textureId...
Timer? _frameTimer;

void startFrameProcessing(int textureId) {
  _frameTimer = Timer.periodic(const Duration(milliseconds: 400), (_) async {
    final frame = await MetaWearablesDat.captureStreamFrame(textureId);
    if (frame == null) return;
    // frame.bytes is raw RGBA — feed directly to ML Kit, Vision, etc.
    // frame.width → 720
    // frame.height → 1280
    await runOcr(frame.bytes, frame.width, frame.height);
  });
}

void stopFrameProcessing() => _frameTimer?.cancel();
```

Parameters:

- `textureId` — the ID returned by `startStreamSession` (required)
- `width` / `height` — capture resolution, defaults to `720×1280` (glasses native resolution)
- `format` — `FrameFormat.rawRgba` (default), `FrameFormat.rawStraightRgba`, or `FrameFormat.png`
Memory note: Raw RGBA at 720×1280 is ~3.7 MB per frame. Capture on demand (every 200–500 ms is typical for OCR/ML) rather than every rendered frame.
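The ~3.7 MB figure is just width × height × 4 bytes per RGBA pixel, which you can verify with plain arithmetic:

```dart
void main() {
  const width = 720, height = 1280, bytesPerPixel = 4; // RGBA8888
  const bytesPerFrame = width * height * bytesPerPixel; // 3,686,400 bytes
  print('${(bytesPerFrame / 1e6).toStringAsFixed(1)} MB per frame'); // → 3.7 MB
}
```

At a 400 ms capture interval that is roughly 9 MB/s of pixel data, which is why on-demand capture is preferred over per-frame capture.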
Note: See the example app for a complete implementation.
This plugin includes configuration files for AI coding assistants (Claude Code, Cursor, GitHub Copilot). Install them to give your AI assistant full context on DAT integration patterns:
```shell
# One-liner — installs all tools
curl -sL https://raw.githubusercontent.com/rodcone/flutter_meta_wearables_dat/main/install-skills.sh | bash

# Or install specific tools only
curl -sL https://raw.githubusercontent.com/rodcone/flutter_meta_wearables_dat/main/install-skills.sh | bash -s claude
curl -sL https://raw.githubusercontent.com/rodcone/flutter_meta_wearables_dat/main/install-skills.sh | bash -s cursor

# Or from a cloned repo
./install-skills.sh all
```

Your AI assistant will auto-discover the config when you open the project. See also: AI-Assisted Development
If you run into issues, try these steps first:
- Update Meta AI app and glasses firmware: Ensure you have the latest version of the Meta AI app installed on your phone, and within the app, check for and install any available firmware updates for your glasses. See version dependencies.
- Verify installation: Ensure you have followed all installation steps above, including configuration in your code and in the Meta Wearables Developer Center.
- Restart your glasses — If the glasses don't connect or the stream doesn't start, try restarting them:
- Switch the power button to off.
- Press and hold the capture button, then slide the power switch on.
- Release the capture button when the LED turns red (don't wait until the LED turns white).
- From official docs: See Known Issues, FAQ and Report a bug.
Common issues:
- Registration deep link not returning — if registration opens the Meta AI app but the callback does not return to your app, verify that your URL scheme matches the one registered in the Meta Wearables Developer Center. On iOS, ensure `CFBundleURLSchemes` in `Info.plist` (and `AppLinkURLScheme` in the `MWDAT` dict) use the same scheme. On Android, ensure the `data android:scheme` in your activity's intent-filter matches that scheme.
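To sanity-check the callback wiring, forward every incoming deep link to the plugin. The sketch below uses the community `app_links` package (an assumption — any deep-link mechanism works) and the example `myexampleapp` scheme from the setup above:

```dart
import 'package:app_links/app_links.dart';

void listenForMetaCallback() {
  AppLinks().uriLinkStream.listen((uri) async {
    if (uri.scheme == 'myexampleapp') {
      // Completes the registration started by startRegistration()
      await MetaWearablesDat.handleUrl(uri.toString());
    }
  });
}
```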
Still having issues? — Open a GitHub issue with all the details you can provide. This helps us pinpoint the problem and assist you more efficiently.
The example app is a clone of Meta's sample Camera Access native app.
Here's a demo showing what the DAT integration looks like:
Contributions are welcome! Feel free to open issues for bugs or feature requests, and pull requests for improvements.
MIT License — see LICENSE for details.
