
Hand gesture recognition is an exciting computer vision application that lets users interact with devices through hand movements. With OpenCV and Mediapipe, we can build a simple yet effective hand gesture recognition system.
Step 1: Install OpenCV and Mediapipe
Ensure the required libraries are installed:
pip install opencv-python mediapipe numpy
Step 2: Import Libraries and Initialize Mediapipe
Mediapipe is a powerful library for real-time hand tracking.
import cv2
import mediapipe as mp
mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils
hands = mp_hands.Hands(min_detection_confidence=0.7, min_tracking_confidence=0.7)
Step 3: Capture Video Feed
Open a video stream to detect hands in real time:
cap = cv2.VideoCapture(0)
while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.flip(frame, 1)  # Flip horizontally for a mirror effect
    rgb_frame = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # Mediapipe expects RGB
    results = hands.process(rgb_frame)
    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            mp_draw.draw_landmarks(frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)
    cv2.imshow("Hand Gesture Recognition", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()
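One detail worth noting before classifying gestures: Mediapipe reports landmark coordinates normalized to the [0, 1] range relative to the frame. If you need pixel positions (for drawing with cv2.circle, say, or for measuring distances), scale by the frame dimensions. A minimal helper sketch (the name to_pixel is our own):

```python
# Mediapipe landmark coordinates are normalized to [0, 1] relative to the
# frame. Scale by the frame's width and height to get pixel coordinates.
def to_pixel(norm_x, norm_y, frame_width, frame_height):
    return int(norm_x * frame_width), int(norm_y * frame_height)

print(to_pixel(0.5, 0.25, 640, 480))  # (320, 120)
```

Inside the video loop you would call it as to_pixel(lm.x, lm.y, frame.shape[1], frame.shape[0]) for a landmark lm.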
Step 4: Recognizing Specific Gestures
By analyzing landmark positions, we can classify different gestures. Here’s an example of recognizing an open palm:
def is_open_palm(hand_landmarks):
    # Landmark y-values are normalized; smaller y means higher in the frame
    thumb_tip = hand_landmarks.landmark[mp_hands.HandLandmark.THUMB_TIP].y
    index_tip = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP].y
    middle_tip = hand_landmarks.landmark[mp_hands.HandLandmark.MIDDLE_FINGER_TIP].y
    ring_tip = hand_landmarks.landmark[mp_hands.HandLandmark.RING_FINGER_TIP].y
    pinky_tip = hand_landmarks.landmark[mp_hands.HandLandmark.PINKY_TIP].y
    # Open palm: all four fingertips sit above the thumb tip
    return (index_tip < thumb_tip and middle_tip < thumb_tip and
            ring_tip < thumb_tip and pinky_tip < thumb_tip)
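The thumb-relative check above is simple but can misfire when the hand rotates. A common, somewhat more robust heuristic compares each fingertip to the joint below it (the PIP joint). The sketch below works on plain (x, y) tuples so it can be tested without a camera; the landmark indices follow Mediapipe's 21-point hand model (fingertips at 8, 12, 16, 20; PIP joints at 6, 10, 14, 18), and the helper names are our own:

```python
# Count raised fingers from 21 (x, y) landmark points (Mediapipe hand model).
# A finger counts as "up" when its tip is above (smaller y than) its PIP joint.
# The thumb is skipped: it bends sideways, so a y-comparison is unreliable.
FINGER_TIPS = [8, 12, 16, 20]
FINGER_PIPS = [6, 10, 14, 18]

def count_raised_fingers(points):
    """points: list of 21 (x, y) tuples; y grows downward in image coordinates."""
    return sum(
        1 for tip, pip in zip(FINGER_TIPS, FINGER_PIPS)
        if points[tip][1] < points[pip][1]
    )

# Quick check with synthetic landmarks: all four tips above their PIP joints.
open_hand = [(0.5, 0.9)] * 21
for tip in FINGER_TIPS:
    open_hand[tip] = (0.5, 0.2)   # tips raised
for pip in FINGER_PIPS:
    open_hand[pip] = (0.5, 0.5)   # joints below the tips

print(count_raised_fingers(open_hand))  # 4
```

To use it with Mediapipe output, build the list as [(lm.x, lm.y) for lm in hand_landmarks.landmark].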
Modify the video loop to check for gestures:
if results.multi_hand_landmarks:
    for hand_landmarks in results.multi_hand_landmarks:
        mp_draw.draw_landmarks(frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)
        if is_open_palm(hand_landmarks):
            cv2.putText(frame, "Open Palm Detected", (50, 50),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
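One practical wrinkle: the loop detects the gesture on every frame, so a gesture held for a second would trigger an action dozens of times. If a gesture should fire an action just once, debounce it by acting only when the detected gesture changes. A minimal sketch (the class name and callback are illustrative):

```python
# Fire a callback only when the detected gesture changes, so holding an open
# palm does not re-trigger the action on every frame.
class GestureDebouncer:
    def __init__(self, on_gesture):
        self.on_gesture = on_gesture  # called once per new gesture
        self.last = None

    def update(self, gesture):
        """gesture: a string label, or None when no gesture is detected."""
        if gesture != self.last:
            self.last = gesture
            if gesture is not None:
                self.on_gesture(gesture)

# Simulate five frames: palm held, a gap, then a fist held.
events = []
deb = GestureDebouncer(events.append)
for g in ["open_palm", "open_palm", None, "fist", "fist"]:
    deb.update(g)

print(events)  # ['open_palm', 'fist']
```

In the video loop you would call deb.update("open_palm" if is_open_palm(hand_landmarks) else None) once per frame.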
Conclusion
Using OpenCV and Mediapipe, we can recognize hand gestures in real time and map them to actions. Expand this by adding gesture-based commands for controlling applications, games, or IoT devices!