
HeadTracker/MonitorTracking.py blink click or dwell click feature and improved flickerless mouse motion #12

@trappedinspacetime

Description

First of all, congratulations on your project. I’ve repeatedly searched online—especially on GitHub—for open-source eye-tracking systems like this, but except for yours and one other, none of them worked the way I needed. I have SMA Type 3 and can barely use my arms. Using my computer with an eye-tracking system would be much easier for me.

On my Android phone, I use an app called ‘Eva Facial Mouse Pro’. Even though it’s the free version, it’s quite stable and usable. I can operate it with my powered wheelchair.

I use Ubuntu 22.04 MATE. I tried to create an eye-tracking program using my computer’s camera with the help of some AI, but unfortunately I wasn’t very successful.

This repository: https://github.com/cmauri/eva_facial_mouse contains the source code of the app I used on my Android phone. But I don’t know much about Java, and the code looks quite old; I’m not even sure it’s still compatible.

In MonitorTracking.py, I removed the keyboard import because the keyboard module requires superuser permissions on Linux. I replaced the F7 toggle as shown in the diff below, and it now works.

      diff --git a/MonitorTracking.py b/MonitorTracking.py
      index abcdef1..abcdef2 100644
      --- a/MonitorTracking.py
      +++ b/MonitorTracking.py
      @@ -9,7 +9,6 @@ import pyautogui
       import math
       import threading
       import time
      -import keyboard
       
       MONITOR_WIDTH, MONITOR_HEIGHT = pyautogui.size()
       CENTER_X = MONITOR_WIDTH // 2
      @@ -248,14 +247,6 @@ while cap.isOpened():
       
           cv2.imshow("Head-Aligned Cube", frame)
           cv2.imshow("Facial Landmarks", landmarks_frame)
      -
      -    # Toggle mouse control with F7 (requires root on Linux → removed)
      -    if keyboard.is_pressed('f7'):
      -        mouse_control_enabled = not mouse_control_enabled
      -        print(f"[Mouse Control] {'Enabled' if mouse_control_enabled else 'Disabled'}")
      -        time.sleep(0.3)  # debounce
      -
       
           key = cv2.waitKey(1) & 0xFF
           if key == ord('q'):
      @@ -264,6 +255,13 @@ while cap.isOpened():
               calibration_offset_pitch = 180 - raw_pitch_deg
               print(f"[Calibrated] Offset Yaw: {calibration_offset_yaw}, Offset Pitch: {calibration_offset_pitch}")
       
      +    # New toggle for mouse control using 'f'
      +    # (OpenCV cannot detect F7 reliably, so 'f' is used as the toggle key)
      +    if key == ord('f'):
      +        mouse_control_enabled = not mouse_control_enabled
      +        print(f"[Mouse Control] {'Enabled' if mouse_control_enabled else 'Disabled'}")
      +        time.sleep(0.2)  # debounce (time.sleep takes seconds, not milliseconds)
      +
       
       cap.release()
       cv2.destroyAllWindows()
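One side effect of this change worth noting: cv2.waitKey only sees key presses while one of the OpenCV windows ("Head-Aligned Cube" or "Facial Landmarks") has focus, so the 'f' toggle is no longer global the way the old keyboard hook was; that global hook is exactly why the keyboard module needed root in the first place.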

To make this code more usable, the mouse movement driven by the head tracking needs to become more stable, and adding a clicking feature (blink click or dwell click) would make it genuinely usable on a computer.
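To make the request a bit more concrete, here is a rough sketch of the two pieces I have in mind, assuming the script computes a raw target screen position (target_x, target_y) from the head pose each frame: an exponential moving average to smooth the cursor, and a dwell click that fires when the smoothed cursor stays inside a small radius for a set time. Everything here (CursorSmoother, DwellClicker, the thresholds) is only an illustration, not code from this repo.

      import math
      import time
      import pyautogui

      class CursorSmoother:
          """Exponential moving average over the raw target position (illustrative only)."""
          def __init__(self, alpha=0.25):
              self.alpha = alpha          # 0 < alpha <= 1; lower = smoother but laggier
              self.x = None
              self.y = None

          def update(self, target_x, target_y):
              if self.x is None:
                  self.x, self.y = float(target_x), float(target_y)
              else:
                  self.x += self.alpha * (target_x - self.x)
                  self.y += self.alpha * (target_y - self.y)
              return int(self.x), int(self.y)

      class DwellClicker:
          """Click when the cursor stays within `radius` pixels for `dwell_time` seconds."""
          def __init__(self, radius=30, dwell_time=1.0, cooldown=1.5):
              self.radius = radius
              self.dwell_time = dwell_time
              self.cooldown = cooldown    # minimum gap between two clicks
              self.anchor = None          # position where the current dwell started
              self.anchor_time = 0.0
              self.last_click = 0.0

          def update(self, x, y):
              now = time.time()
              moved = (self.anchor is None or
                       math.hypot(x - self.anchor[0], y - self.anchor[1]) > self.radius)
              if moved:
                  self.anchor = (x, y)    # cursor left the dwell zone: restart the timer
                  self.anchor_time = now
                  return
              if (now - self.anchor_time >= self.dwell_time and
                      now - self.last_click >= self.cooldown):
                  pyautogui.click(x, y)
                  self.last_click = now
                  self.anchor = None      # require a fresh dwell before the next click

      # Rough wiring inside the existing frame loop (names are placeholders):
      #   sx, sy = smoother.update(target_x, target_y)
      #   if mouse_control_enabled:
      #       pyautogui.moveTo(sx, sy)
      #       dweller.update(sx, sy)

A blink click could hang off the same loop later using the eye landmarks, but dwell is simpler to test first. While experimenting, it is also worth leaving pyautogui's fail-safe enabled, so throwing the cursor into a screen corner aborts the script if the tracking misbehaves.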
