# Impulse Classification Network (ICN) for video Head Impulse Test


This research proposes the Impulse Classification Network (ICN), a 1D Convolutional Neural Network (1D CNN) that detects noisy data and classifies human vestibulo-ocular reflex (VOR) impulses. ICN is a high-performance classification method that works on a patient's video Head Impulse Test (vHIT) impulse data by identifying abnormalities and artifacts. ICN recovers the actual class of a patient's impulses with 95% accuracy. Paper link
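The README does not include the network definition itself; as a rough, non-authoritative illustration of a 1D CNN impulse classifier (layer sizes, sequence length, and training settings are assumptions, not the published ICN architecture), a Keras sketch could look like this:

```python
# Minimal sketch of a 1D CNN impulse classifier.
# NOT the exact ICN architecture: layer sizes, sequence length, and
# training settings here are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN = 175      # assumed number of samples per impulse trace
N_CHANNELS = 2     # eye velocity and head velocity
N_CLASSES = 4      # Normal, Abnormal, Artifact_phase_shift, Artifact_high_gain

def build_model():
    model = models.Sequential([
        layers.Input(shape=(SEQ_LEN, N_CHANNELS)),
        layers.Conv1D(32, kernel_size=5, activation="relu"),
        layers.MaxPooling1D(2),
        layers.Conv1D(64, kernel_size=5, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```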

We provide training and testing Python scripts. The dataset was recorded with the ICS goggle device.

Figure: lateral canal test, left-side vHIT data with (a) normal and (b) artifact impulses.

Four classes of impulses:

- Normal – 1081
- Abnormal – 804
- Artifact_phase_shift – 797
- Artifact_high_gain – 1115

Total: 3797 impulses

## Training

```
python train.py -l labels.pickle
```

## Testing

```
python test.py -m ./data/yourmodel.h5 -l labels.pickle -i impulse.csv
```
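For reference, test.py is expected to load the saved Keras model and the label mapping, then predict a class for each impulse; a minimal sketch of that flow (the pickle contents and CSV shape are assumptions) is:

```python
# Sketch of the test flow: load model + labels, predict the class of one impulse.
# Assumes labels.pickle stores a list/array mapping class indices to class names.
import pickle
import numpy as np
from tensorflow.keras.models import load_model

model = load_model("./data/yourmodel.h5")
with open("labels.pickle", "rb") as f:
    class_names = pickle.load(f)

impulse = np.loadtxt("impulse.csv", delimiter=",")   # shape assumed (SEQ_LEN, 2)
probs = model.predict(impulse[np.newaxis, ...])      # add batch dimension
print("Predicted class:", class_names[int(np.argmax(probs))])
```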

## Test results

We can check the VOR gain values and predict the class of each impulse (a sketch of the gain computation follows the results below).

Per-class accuracy:

- Artifact_high_gain – 100%
- Normal – 93.44%
- Artifact_phase_shift – 98.72%
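VOR gain is commonly computed as the ratio of the area under the eye-velocity trace to the area under the head-velocity trace during the impulse; a minimal sketch, assuming two aligned, uniformly sampled velocity traces:

```python
# Sketch: VOR gain as area(eye velocity) / area(head velocity) over one impulse.
# Assumes the two velocity traces are aligned and sampled at the same rate.
import numpy as np

def vor_gain(eye_velocity: np.ndarray, head_velocity: np.ndarray) -> float:
    eye_area = np.trapz(np.abs(eye_velocity))
    head_area = np.trapz(np.abs(head_velocity))
    return eye_area / head_area
```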

## Dataset

- `_labels_ready_new.csv` – class labels
- `_left_ready.csv` – impulses; rows alternate per impulse (row 1: EYE_data, row 2: HEAD_data, row 3: EYE_data, row 4: HEAD_data, and so on)
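The alternating rows can be paired into per-impulse arrays; a minimal sketch, assuming every row is comma-separated and has the same number of samples:

```python
# Sketch: pair alternating EYE/HEAD rows of _left_ready.csv into impulse arrays.
# Assumes one impulse per consecutive (eye, head) row pair.
import numpy as np

rows = np.loadtxt("_left_ready.csv", delimiter=",")
eye = rows[0::2]    # rows 1, 3, 5, ... (EYE_data)
head = rows[1::2]   # rows 2, 4, 6, ... (HEAD_data)
impulses = np.stack([eye, head], axis=-1)   # shape: (n_impulses, n_samples, 2)
```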