A beginner-friendly tutorial project that teaches TensorFlow.js and MediaPipe Hands through building an interactive gesture-controlled game.
- TensorFlow.js - Running machine learning models in the browser
- MediaPipe Hands - Real-time hand landmark detection
- Webcam Access - Using the `getUserMedia` API
- Video Processing - Drawing video feeds to HTML5 Canvas
- Real-time Interactions - Creating responsive gesture-based applications
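The webcam and canvas pieces above combine into a short pattern: request a stream with `getUserMedia`, attach it to a `<video>` element, and redraw frames onto a canvas. A minimal sketch; the mirroring choice and the way elements are passed in are illustrative, not the tutorial's actual code:

```javascript
// Sketch: request the webcam and attach the stream to a <video> element.
// getUserMedia prompts the user for camera permission.
async function startWebcam(video) {
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = stream;
  await video.play();
}

// Sketch: copy the current video frame onto a canvas each animation frame.
function drawFrame(video, canvas) {
  const ctx = canvas.getContext('2d');
  ctx.save();
  // Mirror horizontally so on-screen movement matches the player's movement.
  ctx.scale(-1, 1);
  ctx.drawImage(video, -canvas.width, 0, canvas.width, canvas.height);
  ctx.restore();
  requestAnimationFrame(() => drawFrame(video, canvas));
}
```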
- Open the starter/ directory
- Follow the TUTORIAL.md step-by-step
- Complete the TODOs in the code
- Reference completed/ if you get stuck
- Open the completed/ directory
- Open `index.html` in a modern browser
- Grant camera permissions
- Click "Start Game" and play!
air-juggler/
├── starter/ # Incomplete code with TODOs (start here!)
│ ├── index.html # HTML without TensorFlow scripts
│ ├── style.css # Complete styling (provided)
│ ├── game.js # Game boilerplate with TODOs
│ └── handTracking.js # Hand tracking boilerplate with TODOs
├── completed/ # Fully working code (reference)
│ ├── index.html
│ ├── style.css
│ ├── game.js
│ └── handTracking.js
└── README.md # This file
- Open the game - Load `index.html` in a browser (Chrome, Firefox, Edge recommended)
- Grant camera permission - Allow access when prompted
- Click "Start Game" - Wait for the ML model to load (~2-3 seconds)
- Move your hands - Position your hands in front of the camera
- Bounce the ball - Keep the ball in the air by hitting it with your hands!
- A ball falls due to gravity
- Your hands create invisible "paddles" that bounce the ball upward
- If the ball falls off the bottom of the screen, game over
- Score is based on how long you survive (in seconds)
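The mechanics above reduce to a small per-frame physics update: apply gravity, check each hand as a circular paddle, and end the game when the ball leaves the bottom edge. A hedged sketch; the constants and the circle-overlap test are illustrative, not the tutorial's actual values:

```javascript
// Illustrative values only; the tutorial's real configuration may differ.
const GRAVITY = 0.5;          // downward acceleration in pixels/frame^2
const BOUNCE_VELOCITY = -12;  // upward kick applied when a hand hits the ball

// Advance the ball one frame. Returns false when the ball falls off-screen.
function updateBall(ball, hands, canvasHeight) {
  ball.vy += GRAVITY;  // gravity accelerates the ball downward
  ball.y += ball.vy;

  // Each hand acts as an invisible circular paddle: if the falling ball
  // overlaps a hand, bounce it back upward.
  for (const hand of hands) {
    const dist = Math.hypot(ball.x - hand.x, ball.y - hand.y);
    if (ball.vy > 0 && dist < ball.radius + hand.radius) {
      ball.vy = BOUNCE_VELOCITY;
    }
  }

  // Falling off the bottom of the screen means game over.
  return ball.y - ball.radius <= canvasHeight;
}
```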
- Basic JavaScript knowledge
- Understanding of async/await
- HTML5 Canvas basics (helpful but not required)
- Modern web browser with webcam
Requires a modern browser with:
- WebRTC support (for webcam access)
- ES6+ JavaScript support
- HTML5 Canvas support
- WebGL support (for GPU acceleration)
Tested on:
- Chrome 90+
- Firefox 88+
- Edge 90+
- Safari 14+
- HTML5 Canvas - Game rendering
- Vanilla JavaScript - No frameworks needed!
- TensorFlow.js - Machine learning framework
- MediaPipe Hands - Pre-trained hand detection model
- Detection runs at ~30 FPS - Good balance of accuracy and performance
- Rendering runs at 60 FPS - Smooth visuals
- Model loading - First load downloads ~10MB, then cached
- GPU acceleration - Automatically used when available
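A common way to get the 30 FPS detection / 60 FPS rendering split described above is to throttle the heavy ML call inside the faster render loop. A sketch of that pattern; the helper name and interval are assumptions, not code from the tutorial:

```javascript
// Returns a function that answers "should the expensive work run now?"
// at most once per intervalMs, given timestamps from the outer loop.
function makeThrottle(intervalMs) {
  let last = -Infinity;
  return function shouldRun(now) {
    if (now - last >= intervalMs) {
      last = now;
      return true;
    }
    return false;
  };
}

// Usage inside a 60 FPS render loop (browser-only, shown for context):
// const shouldDetect = makeThrottle(1000 / 30); // ~30 FPS detection
// function loop(now) {
//   if (shouldDetect(now)) runHandDetection(video); // heavy ML call
//   drawGame();                                     // cheap, every frame
//   requestAnimationFrame(loop);
// }
```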
Camera not working?
- Ensure you've granted camera permissions
- Check that no other app is using your camera
- Try refreshing the page
- Check browser console for errors
Model loading slowly?
- First load downloads the MediaPipe Hands model
- Subsequent loads use browser cache
- Check your internet connection
Hands not detected?
- Ensure good lighting conditions
- Keep hands clearly visible to camera
- Try moving closer or adjusting camera angle
- Make sure hands are within the camera frame
Low FPS/Performance?
- Close other browser tabs
- Check if GPU acceleration is enabled
- Try using a different browser
- Reduce `maxHands` from 2 to 1 in the configuration
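Tracking one hand instead of two roughly halves the per-frame ML work. A sketch of what that configuration change might look like, assuming the TensorFlow.js hand-pose-detection API; check `handTracking.js` for the exact option names the tutorial uses:

```javascript
// Assumed detector configuration; option names follow the TensorFlow.js
// hand-pose-detection package and may differ from the tutorial's code.
const detectorConfig = {
  runtime: 'mediapipe',
  solutionPath: 'https://cdn.jsdelivr.net/npm/@mediapipe/hands',
  maxHands: 1, // track a single hand for better performance
};

// const detector = await handPoseDetection.createDetector(
//   handPoseDetection.SupportedModels.MediaPipeHands, detectorConfig);
```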
Once you've completed the tutorial, try these challenges:
- Multiple balls - Juggle 2-3 balls instead of 1
- Difficulty levels - Adjust gravity and bounce velocity
- Finger tracking - Detect individual fingers instead of palm
- Gesture recognition - Recognize specific hand gestures
- Sound effects - Add audio feedback when bouncing
- Power-ups - Add special items that appear randomly
- Leaderboard - Save high scores to localStorage
- Multiplayer - Two-player mode with different colored balls
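For the leaderboard challenge, the core is maintaining a sorted top-N list of scores; persisting it is then one `localStorage` call. A sketch; the storage key and list size are arbitrary choices, not part of the tutorial:

```javascript
// Insert a new score into a descending top-N list. Pure logic, so it works
// outside the browser too; persistence is shown commented out below.
function addScore(scores, newScore, maxEntries = 5) {
  return [...scores, newScore]
    .sort((a, b) => b - a)   // highest score first
    .slice(0, maxEntries);   // keep only the top N
}

// Browser persistence (the key 'airJugglerScores' is an arbitrary choice):
// const saved = JSON.parse(localStorage.getItem('airJugglerScores') || '[]');
// localStorage.setItem('airJugglerScores',
//   JSON.stringify(addScore(saved, score)));
```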
- TensorFlow.js Documentation
- MediaPipe Hands Guide
- Hand Pose Detection API
- WebRTC getUserMedia
- HTML5 Canvas Tutorial
Built as part of the Codédex Project Tutorials.
MIT License - Feel free to use this code for learning and teaching!