This repository contains the code and documentation for a hand gesture-controlled LED system. The project utilizes computer vision techniques to detect hand gestures and control the illumination of LEDs connected to an Arduino board.
The Hand Gesture-Controlled LED project aims to create an interactive system where users can control the state of LEDs using hand gestures captured by a webcam. The system employs computer vision algorithms for hand detection and gesture recognition, coupled with Arduino-based LED control. Users can perform predefined gestures, such as thumbs-up or peace sign, to toggle the LEDs on or off in real-time.
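The core loop of such a system can be sketched in a few dozen lines of Python. The example below is a minimal, self-contained illustration rather than the repository's `PYTHON_CODE.py`: it assumes the cvzone 1.5+ `HandDetector` API, a one-byte serial protocol (`'1'` = LEDs on, `'0'` = LEDs off), a 9600 baud link, and a placeholder port name, and it maps the thumbs-up and peace-sign gestures mentioned above to "on" and "off" respectively.

```python
import cv2
import serial
from cvzone.HandTrackingModule import HandDetector

# Assumed values: adjust the port name and baud rate to match your setup.
com_port = "COM3"          # e.g. "/dev/ttyACM0" on Linux
arduino = serial.Serial(com_port, 9600, timeout=1)

cap = cv2.VideoCapture(0)                      # default webcam
detector = HandDetector(detectionCon=0.8, maxHands=1)

while True:
    success, img = cap.read()
    if not success:
        break

    # findHands (cvzone >= 1.5) returns a list of hand dicts plus the annotated frame.
    hands, img = detector.findHands(img)
    if hands:
        # fingersUp returns [thumb, index, middle, ring, pinky] as 0/1 flags.
        fingers = detector.fingersUp(hands[0])
        if fingers == [1, 0, 0, 0, 0]:         # thumbs-up -> LEDs on (assumed mapping)
            arduino.write(b'1')
        elif fingers == [0, 1, 1, 0, 0]:       # peace sign -> LEDs off (assumed mapping)
            arduino.write(b'0')

    cv2.imshow("Hand Gesture LED Control", img)
    if cv2.waitKey(1) & 0xFF == ord('q'):      # press 'q' to quit
        break

cap.release()
cv2.destroyAllWindows()
arduino.close()
```

On the Arduino side, `ARDUINO_CODE.ino` is expected to read each incoming byte over `Serial` and drive the LED pins accordingly.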
- Arduino Uno board
- LEDs (quantity as desired)
- USB webcam
- Breadboard
- Jumper wires
- Python 3
- Arduino IDE
- OpenCV library
- cvzone library (specifically the HandTrackingModule)
- pySerial library
- Connect the LEDs to the digital pins of the Arduino board using jumper wires.
- Connect the Arduino board to your computer via USB.
- Install the required Python libraries:

  ```
  pip install opencv-python cvzone pyserial
  ```
- Upload the Arduino code (`ARDUINO_CODE.ino`) to the Arduino board using the Arduino IDE.
- Update the `com_port` variable in the Python code (`PYTHON_CODE.py`) to match the serial port of your Arduino board (the port-listing sketch after these steps can help identify it).
- Run the Python script:

  ```
  python main.py
  ```
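If you are unsure which serial port the Arduino enumerates as, pySerial can list the available ports. This short helper is not part of the repository; it simply prints candidate device names (typically `COM3`-style on Windows, `/dev/ttyACM0` or `/dev/ttyUSB0` on Linux) that you can copy into the `com_port` variable.

```python
from serial.tools import list_ports

# Print every serial device pySerial can see, with its description,
# to help identify the Arduino's port for the com_port variable.
for port in list_ports.comports():
    print(port.device, "-", port.description)
```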
- The system will initialize the webcam and hand detection module.
- Perform hand gestures in front of the webcam.
- The LEDs will turn on or off in response to the recognized gestures.
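Because the detection loop runs once per video frame, a naive implementation would rewrite the same command to the serial port many times per second. A common refinement, sketched below under the same assumed one-byte `'1'`/`'0'` protocol and 9600 baud link, is to send a command only when the desired LED state actually changes; the `LedLink` class and its names are illustrative, not code from this repository.

```python
import serial


class LedLink:
    """Sends an LED command to the Arduino only when the desired state changes."""

    def __init__(self, port: str, baud: int = 9600) -> None:
        self.arduino = serial.Serial(port, baud, timeout=1)
        self.last_state = None          # None until the first command is sent

    def update(self, leds_on: bool) -> None:
        # Write '1' or '0' only on a state transition to avoid flooding the link.
        if leds_on != self.last_state:
            self.arduino.write(b'1' if leds_on else b'0')
            self.last_state = leds_on
```

Inside the main loop, a call such as `led_link.update(True)` would take the place of the direct `arduino.write(b'1')` shown in the earlier sketch.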
A detailed project report (`project_report.pdf`) is included in the repository, providing comprehensive information on the project's objectives, methodology, results, and analysis.
- Implement additional hand gestures for more complex LED control.
- Develop a graphical user interface (GUI) for improved user interaction.
- Integrate with IoT platforms for remote control and automation features.
- Explore the use of depth-sensing cameras for enhanced gesture recognition accuracy.
This project was inspired by the growing field of human-computer interaction and the potential of hand gesture recognition technology.