Real-time gesture detection using MediaPipe with parallel meme display.

## Features
- **12 gestures detected**: Smirk, Wink, Speed, Patrick, Thinking, Shush, Giggle, Cut It Out, Shock, LeBron Scream, Shaq T, Surprise
- **Split-screen display**: Webcam feed (left) + corresponding meme (right)
- **GIF support**: Animated memes loop automatically
## Prerequisites

- Python 3.11+
- Webcam
- Git
## Installation

**1. Clone the repository:**

```bash
git clone https://github.com/razancodes/emote-meme.git
cd emote-meme
```

**2. Create and activate a virtual environment.**

On macOS/Linux:
```bash
python3 -m venv venv
source venv/bin/activate
```

On Windows:
```bash
python -m venv venv
venv\Scripts\activate
```

**3. Install dependencies:**

```bash
pip install -r requirements.txt
```
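The repository's `requirements.txt` is the source of truth for dependencies. As a rough guide only (an assumption, not a copy of the actual file), a MediaPipe + OpenCV project like this typically needs:

```
mediapipe
opencv-python
numpy
```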
**4. Set up the meme images.**

The `./images/` folder should already exist with default memes. If not, create it:

```bash
mkdir images
```

### Required Meme Files

Add these files to the `./images/` folder:
| Gesture | Filename |
|---|---|
| Smirk | smirk-meme.jpg |
| Wink | monkey-wink.jpg |
| Shaq T | shaq.jpg |
| Patrick | patrick-meme.jpg |
| Speed | speed.gif |
| Shock | shock-guy-meme.jpg |
| Cut It Out | cut-it.gif |
| Shush | dog-shush.jpg |
| Thinking | monkey-thinking.jpg |
| LeBron | lebron-scream.jpg |
| Giggle | baby-meme-giggle.gif |
| Idle | idle.jpg |
## Usage

Run the application:

```bash
python main.py
```

- The application will automatically download the required MediaPipe models on first run
- Your webcam feed will appear on the left, with the corresponding meme on the right
- Perform gestures to trigger different memes
- Press 'q' to quit
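Under the hood, the display loop has roughly this shape. This is a minimal sketch assuming only OpenCV and NumPy (the window title here is arbitrary); the real loop in `main.py` also runs MediaPipe inference, swaps the meme based on the active gesture, and animates GIFs:

```python
import cv2
import numpy as np

# Sketch of the split-screen loop: webcam left, meme right, 'q' to quit.
cap = cv2.VideoCapture(0)
meme = cv2.imread("images/idle.jpg")  # assumes the idle meme exists
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # Resize the meme to the webcam frame size, then stack side by side
    panel = cv2.resize(meme, (frame.shape[1], frame.shape[0]))
    cv2.imshow("emote-meme", np.hstack([frame, panel]))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```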
## Troubleshooting

**Webcam not detected:**
- Ensure your webcam is connected and not in use by another application
- Try changing the camera index in `main.py` (line ~750): change `cap = cv2.VideoCapture(0)` to `cap = cv2.VideoCapture(1)`
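If you're not sure which index your webcam uses, this standalone probe (not part of `main.py`) reports which indices deliver frames:

```python
import cv2

# Try the first few camera indices and report which ones work.
for idx in range(4):
    cap = cv2.VideoCapture(idx)
    ok = cap.isOpened() and cap.read()[0]
    print(f"Camera index {idx}: {'works' if ok else 'unavailable'}")
    cap.release()
```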
**Model download fails:**
- Check your internet connection
- Manually download models from:
- Place them in the `./models/` folder
**Gestures not detecting:**
- Ensure good lighting
- Keep your face and hands visible in the frame
- Adjust detection thresholds in the corresponding gesture functions
## Gesture Guide

- **Smirk** - Asymmetric smile
- **Wink** - One eye closed
- **Speed** - Squint + pursed lips
- **Patrick** - Jaw drop (no hands)
- **Thinking** - Finger at mouth corner + mouth open
- **Shush** - Finger on lips + face sideways
- **Giggle** - Hand covering mouth
- **Cut It Out** - Flat hand at neck level
- **Shock** - Hands on head + mouth open
- **LeBron** - Scream + hands down
- **Shaq T** - T-shape timeout gesture
## Contributing Memes

We welcome contributions to improve or replace memes! Follow these guidelines when submitting meme updates:
- Check existing memes in the `./images/` folder to avoid duplicates
- Ensure your meme aligns with its corresponding gesture
- Verify the meme is appropriate and high-quality
1. Fork the repository and create a new branch for your meme update:

   ```bash
   git checkout -b update/gesture-name-meme
   ```
2. Add or replace your meme file in the `./images/` folder:
   - Use the exact filename from the "Required Meme Files" table
   - Supported formats: `.jpg`, `.png`, `.gif`
   - For GIFs: Ensure they loop smoothly and aren't excessively large
   - Image resolution: Recommended 500x500px or larger (will be resized to fit)
3. Test locally before submitting:

   ```bash
   python main.py  # Verify the gesture displays your new meme correctly
   ```
4. Commit your changes:

   ```bash
   git add images/your-meme-file
   git commit -m "Update: Replace [gesture] meme with [brief description]"
   ```
5. Push and open a Pull Request with:
   - Title: `Update [Gesture Name] meme`
   - Description: Explain why you're updating the meme (better quality, funnier, more relevant, etc.)
   - PR body: Include a screenshot or preview if possible
### Meme Checklist

- [ ] Image is clear and high-quality
- [ ] Filename matches the table above exactly
- [ ] File size is reasonable (GIFs < 5MB, JPGs < 2MB)
- [ ] Tested with the gesture detection locally
- [ ] No copyright/licensing issues
## Updating Gesture Detection

Want to add a new gesture or improve existing detection? Here's how to update the gesture detection functions in `main.py`.
Gesture detection functions are located in `main.py` around lines 280-650. Each function typically:
- Takes face landmarks, hand landmarks, and/or blendshapes as parameters
- Returns a boolean indicating if the gesture is detected
- Uses distance calculations and threshold values
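For orientation, here is a minimal sketch of that shape, using Giggle (hand covering mouth) as the example. The landmark indices follow the reference tables further down; the actual function and threshold in `main.py` may differ:

```python
import math

def detect_giggle_gesture(face_landmarks, hand_landmarks_list,
                          threshold: float = 0.08) -> bool:
    """Giggle: any fingertip close to the mouth (hand covering mouth)."""
    if not face_landmarks or not hand_landmarks_list:
        return False
    upper_lip = face_landmarks[13]  # upper-lip landmark index
    for hand in hand_landmarks_list:
        for tip_idx in (4, 8, 12, 16, 20):  # fingertip indices
            tip = hand[tip_idx]
            # Landmarks are normalized, so the threshold is a fraction of frame size
            if math.hypot(tip.x - upper_lip.x, tip.y - upper_lip.y) < threshold:
                return True
    return False
```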
### Visual Debugging

By default, tracking dots and hand skeletons are disabled for a cleaner UI. To enable them for debugging:
**For face landmarks (tracking dots on face):**

Find this line in the `main()` function (around line 815):

```python
# draw_face_landmarks(frame, face_landmarks)  # Disabled: no tracking dots
```

Uncomment it to:

```python
draw_face_landmarks(frame, face_landmarks)  # Enabled: shows face tracking dots
```

**For hand skeleton (bone structure visualization):**

Find this line in the `main()` function (around line 837):

```python
# draw_hand_landmarks(frame, hand_landmarks_list)  # Disabled: no tracking dots
```

Uncomment it to:

```python
draw_hand_landmarks(frame, hand_landmarks_list)  # Enabled: shows hand skeleton
```
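For reference, the rough idea behind such a drawing helper (a sketch, not the actual code in `main.py`) is to project each normalized landmark into pixel space and draw it:

```python
import cv2

def draw_hand_landmarks_sketch(frame, hand_landmarks_list):
    """Debug aid: draw a numbered dot at each normalized hand landmark."""
    h, w = frame.shape[:2]
    for hand in hand_landmarks_list:
        for i, lm in enumerate(hand):
            x, y = int(lm.x * w), int(lm.y * h)
            cv2.circle(frame, (x, y), 3, (0, 255, 0), -1)
            cv2.putText(frame, str(i), (x + 4, y - 4),
                        cv2.FONT_HERSHEY_PLAIN, 0.8, (255, 255, 255), 1)
```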
### Example: Adding a Thumbs-Up Gesture

Let's create a new thumbs-up gesture as an example.

**Step A:** Add the meme to the mapping (around line 60):

```python
GESTURE_MEME_MAP = {
    # ... existing gestures ...
    "thumbs_up": "thumbs-up-meme.jpg",  # Add this line
}
```

**Step B:** Create the detection function (add around line 600):

```python
def detect_thumbs_up_gesture(hand_landmarks_list: list) -> bool:
    """
    Detect "Thumbs Up" gesture: thumb extended upward, other fingers curled.

    Hand landmarks:
    - Thumb tip: index 4
    - Thumb IP: index 3
    - Index finger tip: index 8
    - Index finger MCP: index 5

    Requirements:
    - Thumb tip higher than thumb IP (thumb pointing up)
    - Index finger tip lower than MCP (finger curled)
    """
    if not hand_landmarks_list:
        return False
    for hand in hand_landmarks_list:
        thumb_tip = hand[4]
        thumb_ip = hand[3]
        index_tip = hand[8]
        index_mcp = hand[5]
        # Check if thumb is extended upward (smaller y = higher in frame)
        thumb_extended = thumb_tip.y < thumb_ip.y
        # Check if index finger is curled (tip below knuckle)
        index_curled = index_tip.y > index_mcp.y
        if thumb_extended and index_curled:
            return True
    return False
```

**Step C:** Add detection to the main loop (around line 880):

```python
# Inside the main() function, after other gesture checks:
if detect_thumbs_up_gesture(hand_landmarks_list):
    detections.append(("Thumbs Up!", (0, 255, 0)))
    if active_gesture is None:
        active_gesture = "thumbs_up"
```

**Step D:** Test with visual debugging enabled:
```python
# Enable hand skeleton to see landmark positions
draw_hand_landmarks(frame, hand_landmarks_list)
```

Run the app and perform the thumbs-up gesture. You'll see:
- Hand skeleton with numbered landmark points
- Detection label appears when successful
**Step E:** Fine-tune threshold values. If detection is too sensitive or too insensitive, adjust the conditions:

```python
# Make thumb requirement stricter
thumb_extended = thumb_tip.y < (thumb_ip.y - 0.05)

# Add middle finger check
middle_tip = hand[12]
middle_mcp = hand[9]
middle_curled = middle_tip.y > middle_mcp.y
if thumb_extended and index_curled and middle_curled:
    return True
```

## Landmark Reference

**Face Landmarks (468 total points):**
- Nose tip: `1`
- Upper lip: `13`
- Lower lip: `14`
- Left mouth corner: `61`
- Right mouth corner: `291`
- Left eye outer: `33`
- Right eye outer: `263`
- Left face edge: `234`
- Right face edge: `454`
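These indices make simple geometric features cheap to compute. For example, a mouth-openness measure normalized by mouth width (a sketch; `main.py` may compute this differently):

```python
import math

def mouth_open_ratio(face_landmarks) -> float:
    """Vertical lip gap divided by mouth width, using the indices above."""
    upper, lower = face_landmarks[13], face_landmarks[14]
    left, right = face_landmarks[61], face_landmarks[291]
    width = math.hypot(right.x - left.x, right.y - left.y)
    gap = math.hypot(lower.x - upper.x, lower.y - upper.y)
    return gap / width if width > 0 else 0.0
```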
**Hand Landmarks (21 points per hand):**
- Wrist: `0`
- Thumb: `1, 2, 3, 4` (tip at 4)
- Index: `5, 6, 7, 8` (tip at 8)
- Middle: `9, 10, 11, 12` (tip at 12)
- Ring: `13, 14, 15, 16` (tip at 16)
- Pinky: `17, 18, 19, 20` (tip at 20)
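A small convenience map over these indices (hypothetical, not part of `main.py`) makes curl checks like the thumbs-up example reusable:

```python
# (tip, MCP) landmark index pairs per finger, from the list above
FINGER_TIP_MCP = {
    "thumb": (4, 2),
    "index": (8, 5),
    "middle": (12, 9),
    "ring": (16, 13),
    "pinky": (20, 17),
}

def finger_curled(hand, finger: str) -> bool:
    """A finger counts as curled when its tip sits below its MCP (larger y)."""
    tip_idx, mcp_idx = FINGER_TIP_MCP[finger]
    return hand[tip_idx].y > hand[mcp_idx].y
```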
**Blendshapes (52 facial expressions):**

- `"jawOpen"` - Mouth openness (0-1)
- `"mouthSmileLeft"` / `"mouthSmileRight"` - Smile asymmetry
- `"eyeBlinkLeft"` / `"eyeBlinkRight"` - Eye closure
- `"browInnerUp"` - Eyebrow raise
Full reference: https://storage.googleapis.com/mediapipe-assets/documentation/mediapipe_blendshapes.png
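Blendshape scores pair naturally with expressions like Wink. A sketch, assuming the scores have been collected into a name-to-score dict (MediaPipe itself returns a list of categories with `category_name` and `score` fields):

```python
def detect_wink_blendshapes(blendshapes: dict,
                            closed: float = 0.6, open_max: float = 0.3) -> bool:
    """Wink: one eye clearly closed while the other stays open."""
    left = blendshapes.get("eyeBlinkLeft", 0.0)
    right = blendshapes.get("eyeBlinkRight", 0.0)
    return ((left > closed and right < open_max)
            or (right > closed and left < open_max))
```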
### PR Checklist

When submitting a PR for a new gesture or detection improvement:
- [ ] Gesture function is well-documented with clear requirements
- [ ] Function includes parameter types and return type hints
- [ ] Tested with visual debugging enabled (dots/skeleton)
- [ ] Meme file added to the `./images/` folder
- [ ] Meme mapping updated in `GESTURE_MEME_MAP`
- [ ] Detection call added to main loop with appropriate priority
- [ ] False positive rate is acceptable (doesn't trigger on similar gestures)
- [ ] Threshold values are commented and explained
- [ ] PR includes a before/after video demonstration
### Debugging Tips

**Gesture not triggering:**
- Enable `draw_hand_landmarks()` or `draw_face_landmarks()`
- Print landmark values to the console:

  ```python
  print(f"Thumb Y: {thumb_tip.y}, IP Y: {thumb_ip.y}")
  ```
- Lower threshold values
**False positives:**
- Add more restrictive conditions
- Require multiple landmarks to meet criteria
- Increase threshold values
- Add blendshape requirements
**Gesture conflicts with others:**
- Check gesture priority order in main loop
- Add exclusion conditions (e.g., "only trigger if X gesture is NOT active")
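In practice, priority falls out of check order. A sketch of how the main loop can order checks so broad gestures don't shadow specific ones (`detect_shock_gesture` is hypothetical here; names mirror the thumbs-up example):

```python
# Check the most constrained gestures first; an elif chain means later
# checks only run when nothing above has claimed the frame.
active_gesture = None
if detect_shock_gesture(face_landmarks, hand_landmarks_list):
    active_gesture = "shock"       # hands on head + mouth open (very specific)
elif detect_thumbs_up_gesture(hand_landmarks_list):
    active_gesture = "thumbs_up"   # hand-only check, easier to false-trigger
```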
Thank you for helping improve the AI Meme Emote Detector!
