Thank you for your interest in contributing to OpenVision! This document provides guidelines and information for contributors.
## Code of Conduct

Be respectful, inclusive, and constructive. We welcome contributors of all skill levels.
## Reporting Bugs

- Check existing issues first
- Create a new issue with:
  - Clear, descriptive title
  - Steps to reproduce
  - Expected vs actual behavior
  - iOS version and device model
  - Relevant logs (filter by `[OpenClaw]`, `[Gemini]`, etc.)
## Requesting Features

- Check existing discussions first
- Open a new discussion describing:
  - The problem you're trying to solve
  - Your proposed solution
  - Alternative approaches considered
## Submitting Changes

1. Fork the repository
2. Create a feature branch from `main`:

   ```bash
   git checkout -b feature/your-feature-name
   ```

3. Make your changes following our code style
4. Write or update tests as needed
5. Commit with clear messages:

   ```bash
   git commit -m "Add: Description of feature"
   git commit -m "Fix: Description of bug fix"
   git commit -m "Refactor: Description of refactoring"
   ```

6. Push and open a Pull Request
## Code Style

Follow the Swift API Design Guidelines.

```swift
// GOOD: Clear, descriptive names
func startVoiceSession() async throws
func capturePhoto() -> Data?

// BAD: Abbreviated, unclear names
func startVS() async throws
func capPh() -> Data?
```

### Architecture Patterns

- `@MainActor`: Use for all UI-related code and managers
- Singleton pattern: For shared managers (`SettingsManager`, `GlassesManager`)
- Protocol-oriented: Define protocols for testability
- Async/await: Prefer over completion handlers
```swift
// GOOD: @MainActor isolation
@MainActor
final class MyService: ObservableObject {
    @Published var state: ServiceState = .idle

    func performAction() async throws {
        // Implementation
    }
}

// GOOD: Protocol for testability
protocol CameraServiceProtocol {
    func capturePhoto() async throws -> Data
}
```

Organize types with `// MARK:` comments:

```swift
// MARK: - Properties
@Published var isActive = false
private var webSocket: URLSessionWebSocketTask?

// MARK: - Initialization
init() {
    // Setup
}

// MARK: - Public Methods
func start() async throws {
    // Implementation
}

// MARK: - Private Methods
private func setupConnection() {
    // Implementation
}
```

Add documentation to public APIs:
```swift
/// Starts a voice conversation session.
///
/// This method connects to the configured AI backend and begins
/// listening for voice input.
///
/// - Throws: `AIBackendError.notConfigured` if no backend is configured
/// - Throws: `AIBackendError.connectionFailed` if the connection fails
func startSession() async throws {
    // Implementation
}
```

### Error Handling

Use typed errors with clear cases:
```swift
enum AIBackendError: LocalizedError {
    case notConfigured
    case connectionFailed(Error)
    case authenticationFailed
    case timeout

    var errorDescription: String? {
        switch self {
        case .notConfigured:
            return "AI backend is not configured"
        case .connectionFailed(let error):
            return "Connection failed: \(error.localizedDescription)"
        case .authenticationFailed:
            return "Authentication failed"
        case .timeout:
            return "Connection timed out"
        }
    }
}
```

## Testing

```bash
# In Xcode
Cmd+U

# Or via the command line
xcodebuild test -scheme OpenVision -destination 'platform=iOS Simulator,name=iPhone 15'
```

- Place tests in the `Tests/` directory
- Mirror the source file structure
- Use mocks for external dependencies
```swift
final class OpenClawServiceTests: XCTestCase {
    var sut: OpenClawService!
    var mockWebSocket: MockWebSocket!

    override func setUp() {
        super.setUp()
        mockWebSocket = MockWebSocket()
        sut = OpenClawService(webSocket: mockWebSocket)
    }

    func testConnectSendsAuthMessage() async throws {
        // When
        try await sut.connect()

        // Then
        XCTAssertTrue(mockWebSocket.sentMessages.contains { $0.contains("auth") })
    }
}
```

Aim for coverage of:

- Public API methods
- Error handling paths
- Edge cases (empty data, network failures)
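An error-handling path from the list above might be exercised like this (the stub service is hypothetical, and `AIBackendError` is reduced to the one case the test needs):

```swift
import XCTest

// Stand-in for the full AIBackendError enum defined earlier
enum AIBackendError: Error { case notConfigured }

// Hypothetical service used only for illustration
final class StubBackendService {
    var isConfigured = false
    func connect() async throws {
        guard isConfigured else { throw AIBackendError.notConfigured }
    }
}

final class ErrorPathTests: XCTestCase {
    func testConnectThrowsWhenNotConfigured() async {
        let sut = StubBackendService()
        do {
            try await sut.connect()
            XCTFail("Expected AIBackendError.notConfigured")
        } catch AIBackendError.notConfigured {
            // Expected: the error path was taken
        } catch {
            XCTFail("Unexpected error: \(error)")
        }
    }
}
```

Asserting on the specific error case, not just "it threw", keeps the test meaningful when new cases are added.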
## Project Structure

```
OpenVision/
├── App/            # App entry point
├── Config/         # Configuration files
├── Models/         # Data models
├── Services/       # Business logic
│   ├── AIBackend/  # AI protocol & factory
│   ├── OpenClaw/   # OpenClaw WebSocket client
│   ├── GeminiLive/ # Gemini Live client
│   ├── Voice/      # Speech recognition
│   ├── Audio/      # Audio capture/playback
│   ├── Camera/     # Camera services
│   └── TTS/        # Text-to-speech
├── Managers/       # Singleton managers
├── Views/          # SwiftUI views
│   ├── VoiceAgent/ # Voice conversation UI
│   ├── History/    # Conversation history
│   ├── Settings/   # Settings panels
│   └── Components/ # Reusable UI components
└── Tests/          # Unit tests
```
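The `AIBackend` layer in the tree above suggests a protocol-plus-factory shape. A sketch of that idea (every name except OpenClaw and Gemini Live is an assumption, not the project's actual API):

```swift
// Sketch only: illustrates the protocol & factory pattern, not the real interfaces
protocol AIBackend {
    func startSession() async throws
}

enum BackendKind { case openClaw, geminiLive }

struct OpenClawBackend: AIBackend {
    func startSession() async throws { /* connect OpenClaw WebSocket */ }
}

struct GeminiLiveBackend: AIBackend {
    func startSession() async throws { /* connect Gemini Live */ }
}

enum AIBackendFactory {
    // Returning the protocol type keeps call sites backend-agnostic
    static func make(_ kind: BackendKind) -> AIBackend {
        switch kind {
        case .openClaw: return OpenClawBackend()
        case .geminiLive: return GeminiLiveBackend()
        }
    }
}
```

Views and managers then depend only on `AIBackend`, which also makes mock backends trivial in tests.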
## Commit Messages

Use conventional commit format:

```
Type: Short description (max 72 chars)

Optional longer description explaining the why and how.

Closes #123
```

Types:

- `Add:` New feature
- `Fix:` Bug fix
- `Refactor:` Code restructuring
- `Docs:` Documentation only
- `Test:` Adding/updating tests
- `Chore:` Maintenance tasks
## Pull Requests

Every pull request should include:

- **Title**: Clear and descriptive
- **Description**: Explain what and why
- **Testing**: Describe how you tested
- **Screenshots**: Include for UI changes
- **Breaking changes**: Call out any breaking changes

Before requesting review, check that:

- [ ] Code follows style guidelines
- [ ] Self-review is complete
- [ ] Comments are added for complex logic
- [ ] Documentation is updated if needed
- [ ] Tests are added or updated
- [ ] No compiler warnings are introduced
## Releases

- Version numbers follow Semantic Versioning
- Update `CHANGELOG.md`
- Create a release tag
- A GitHub Action builds and publishes the release

## Getting Help

- Open a Discussion
- Check existing issues and PRs
- Read `CLAUDE.md` for technical context
Thank you for contributing to OpenVision! Your efforts help make AI assistants accessible to everyone.