underwater-gesture-recognition

This repository contains the code used in the paper "Underwater Gesture Recognition Using Classical Computer Vision and Deep Learning Techniques". The code is divided into three groups, one for each model used in the paper.

1. Bag of Visual Words (BOVW):

  • Scripts (adapted from Kushal Vyas’ implementation of Bag of Visual Words: https://github.com/kushalvyas/Bag-of-Visual-Words-Python)

  • To test:

    1. Put train images in the ./images/train folder; this is needed to get the classes used in testing
    2. Put test images in the ./images/test folder
    3. Put kmeans_cluster_3.sav, mega_histogram_3.pkl, svm_train_3.pkl, and vstack_3.pkl in the same directory as Bag.py and helpers.py (link to pretrained weights: https://drive.google.com/open?id=1hSaZwRpbtOqFYep7Z4jW2kkKizEHoVqO)
    4. Run python Bag.py
  • Dependencies:
    - OpenCV 3.4.2.17 (opencv-contrib-python==3.4.2.17), scikit-learn 0.20.3
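
  The Bag.py pipeline above clusters local image descriptors into a visual vocabulary and classifies the resulting histograms with an SVM. The sketch below shows that idea end to end with scikit-learn; the random descriptor arrays are stand-ins for the SIFT features the real script would extract with opencv-contrib-python, so it illustrates the technique rather than reproducing Bag.py.

  ```python
  import numpy as np
  from sklearn.cluster import KMeans
  from sklearn.svm import SVC

  rng = np.random.default_rng(0)

  # Stand-in local descriptors: 20 "images", 50 descriptors each, 32-D.
  # Class 1 descriptors are shifted so the two classes are separable.
  n_images, n_desc, desc_dim, n_words = 20, 50, 32, 8
  train_desc = [rng.normal(size=(n_desc, desc_dim)) + label
                for label in (0, 1) for _ in range(n_images // 2)]
  labels = np.repeat([0, 1], n_images // 2)

  # 1. Build the visual vocabulary by clustering all descriptors
  #    (the role of kmeans_cluster_3.sav in the real pipeline).
  kmeans = KMeans(n_clusters=n_words, n_init=10, random_state=0)
  kmeans.fit(np.vstack(train_desc))

  # 2. Represent each image as a normalized histogram of visual words
  #    (the role of mega_histogram_3.pkl).
  def to_histogram(desc):
      words = kmeans.predict(desc)
      return np.bincount(words, minlength=n_words).astype(float) / len(desc)

  X = np.array([to_histogram(d) for d in train_desc])

  # 3. Train an SVM on the histograms (the role of svm_train_3.pkl).
  svm = SVC(kernel="linear").fit(X, labels)
  print(svm.predict(X[:2]))
  ```

  At test time, each image in ./images/test would go through the same steps 1-2 with the saved vocabulary before being passed to the saved SVM, which is why the pretrained .sav/.pkl files must sit next to Bag.py.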

2. Histogram of Oriented Gradients (HOG):

  • To test:
    1. Put test images in the ./images/test folder
    2. Put the hog_svm.joblib pre-trained model in the ./models folder (link to pretrained weights: https://drive.google.com/open?id=1pirGWIkZqWXBNSwQKuTdLpmMKowYrOUK)
    3. Create a ./results/ folder, where the CSV containing the target classes and predictions will be saved
    4. Run python testing.py
  • To visualize sample correct and incorrect predictions per class:
    1. Create ./visualization/ folder, which will contain the sample images with predictions
    2. Run visualize_predictions.py
  • Dependencies:
    - OpenCV 4.1.0.25 (opencv-contrib-python==4.1.0.25), scikit-learn 0.21.0
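
  A HOG descriptor summarizes an image by histograms of gradient orientations weighted by gradient magnitude. The NumPy sketch below is a deliberately simplified, whole-image version of that idea (real HOG, as used by testing.py via its library implementation, adds cells, overlapping blocks, and block normalization); the toy ramp images are hypothetical inputs chosen so horizontal and vertical structure land in different orientation bins.

  ```python
  import numpy as np

  def hog_like(img, n_bins=9):
      """Simplified HOG-style descriptor: one histogram of unsigned gradient
      orientations over the whole image, weighted by gradient magnitude.
      (Real HOG adds cells, blocks, and normalization; this is a sketch.)"""
      gy, gx = np.gradient(img.astype(float))   # per-pixel gradients
      mag = np.hypot(gx, gy)                    # gradient magnitude
      ang = np.mod(np.arctan2(gy, gx), np.pi)   # unsigned orientation [0, pi)
      bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
      hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
      return hist / (hist.sum() + 1e-8)         # L1-normalize

  # Toy 8x8 "images": a left-to-right ramp has purely horizontal gradients,
  # its transpose purely vertical ones, so their histograms peak in
  # different orientation bins.
  horizontal_ramp = np.tile(np.arange(8), (8, 1)) * 30.0
  vertical_ramp = horizontal_ramp.T
  h1, h2 = hog_like(horizontal_ramp), hog_like(vertical_ramp)
  print(np.argmax(h1), np.argmax(h2))
  ```

  In the actual pipeline, the descriptor for each test image is fed to the SVM stored in hog_svm.joblib, and the predictions are written to the CSV in ./results/.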

3. ResNet50-CNN

Datasets
