Arabic Sign Language gesture detection using MediaPipe and OpenCV
This system extends Kazuhito00's Hand Gesture Recognition by adding support for:
- The complete 28-letter Arabic alphabet
- 3 functional gestures (Space, Delete, Clear)
- Real-time text output with Arabic script rendering
- ✋ 28 Arabic Letters: Custom-trained gesture models for all Arabic characters
- 🛠 Utility Gestures:
  - 👉 Space: Insert a space between words
  - ❌ Delete: Remove the last character
  - 🧹 Clear: Reset the entire text
- 📜 Arabic Text Rendering: Proper RTL display with glyph shaping
- ⚡ Adjustable Sensitivity: Control detection speed via a frame threshold
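The three utility gestures map naturally onto a small text buffer. A minimal sketch of that idea (the class and method names here are illustrative, not taken from this repo's code):

```python
class TextBuffer:
    """Holds the recognized Arabic text and applies the utility gestures."""

    def __init__(self):
        self.chars = []

    def add_letter(self, letter: str):
        self.chars.append(letter)

    def space(self):
        # 👉 Space: insert a space between words
        self.chars.append(" ")

    def delete(self):
        # ❌ Delete: remove the last character, if any
        if self.chars:
            self.chars.pop()

    def clear(self):
        # 🧹 Clear: reset the entire text
        self.chars = []

    @property
    def text(self) -> str:
        return "".join(self.chars)
```

Each recognized letter gesture calls `add_letter`, while the three functional gestures call `space`, `delete`, or `clear`.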
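The frame-threshold sensitivity can be implemented as a simple debounce: a letter is committed only after it has been the top prediction for N consecutive frames, so raising the threshold slows detection but suppresses flicker. A hedged sketch (the class name and exact flow are assumptions, not this repo's code):

```python
class GestureDebouncer:
    """Commit a gesture only after `threshold` consecutive identical predictions."""

    def __init__(self, threshold: int = 15):
        self.threshold = threshold  # higher = slower but more stable detection
        self.last_label = None
        self.count = 0

    def update(self, label: str):
        """Feed one per-frame prediction; return the label once stable, else None."""
        if label == self.last_label:
            self.count += 1
        else:
            self.last_label = label
            self.count = 1
        if self.count == self.threshold:
            return label  # fires exactly once per stable run
        return None
```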
- MediaPipe – Hand tracking
- OpenCV – Camera processing & visualization
- NumPy – Data handling
- Model: CNN – Static gesture classification
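MediaPipe returns 21 (x, y) hand landmarks per frame; before classification, the upstream Kazuhito00 pipeline normalizes them relative to the wrist and scales them into [-1, 1]. A rough NumPy sketch of that preprocessing step (function name is illustrative):

```python
import numpy as np

def preprocess_landmarks(landmarks: np.ndarray) -> np.ndarray:
    """Normalize (x, y) landmarks: translate so the wrist is the origin,
    flatten to a 1-D feature vector, then scale by the largest absolute value."""
    pts = landmarks.astype(np.float64)
    pts = pts - pts[0]             # wrist (landmark 0) becomes the origin
    flat = pts.flatten()
    max_abs = np.max(np.abs(flat))
    if max_abs > 0:
        flat = flat / max_abs      # all coordinates now lie in [-1, 1]
    return flat
```

For 21 landmarks this yields a 42-value feature vector, which is what the gesture classifier consumes.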
To train the letter at index 23 in "keypoint_classifier_label.csv":
- In the code, search (Shift + F) for "number +"
- Add an offset next to "number +": for class indices 0 to 9 add 0, for 10 to 19 add 10, for 20 to 28 add 20 (here: add 20)
- Run the app
- Press "k" to enter training mode
- Press "2" (20 + 2 = 22; the CSV uses 0-based class IDs while label rows start from 1, so row 23 maps to class 22)
- Make the gesture 20+ times
- Close the app
- Open "keypoint_classification_EN" in a Jupyter notebook and run all cells
- Training done ✅
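The offset edit in the steps above amounts to shifting the pressed digit key into the right class-ID decade, since the keyboard only exposes keys 0–9. A sketch of that mapping (the function name is illustrative, not from the repo):

```python
def key_to_class_id(digit_key: str, offset: int) -> int:
    """Map a pressed digit key (0-9) plus the code's offset to the logged class ID.

    Offset is 0 for classes 0-9, 10 for classes 10-19, 20 for classes 20-28.
    """
    if not (len(digit_key) == 1 and digit_key.isdigit()):
        raise ValueError("expected a single digit key 0-9")
    return int(digit_key) + offset

# With the offset set to 20, pressing "2" logs class 22,
# which corresponds to row 23 of keypoint_classifier_label.csv (rows are 1-based).
```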
Kazuhito00's Hand Gesture Recognition Repo: https://github.com/kinivi/hand-gesture-recognition-mediapipe
