
Arrhythmia Feature Selection using Meta-Heuristic Algorithms

Arrhythmia refers to an irregular heartbeat or an abnormal heart rhythm. The heart normally beats in a regular pattern, but in individuals with arrhythmia, the heart may beat too quickly, too slowly, or with an irregular pattern. Some arrhythmias may not cause any noticeable symptoms and may be harmless, while others can be more severe and potentially life-threatening.

There are several parameters/features that we might consider while classifying arrhythmia into different types.

However, considering all of these features for arrhythmia classification is computationally expensive when dealing with large datasets.

We have tried to optimize the arrhythmia classification problem by using meta-heuristic algorithms to decide which features to select. We use a Random Forest classifier to report the accuracy of the selected features on the arrhythmia classification task.

To learn more about Random Forest refer to the link given below.

Link

Dataset

Our dataset consists of 452 samples with 279 features, and each sample is assigned one of the 16 classes shown below.

The table below shows each class label and its corresponding class.

| Class label | Class |
| --- | --- |
| 1 | Normal |
| 2 | Ischemic changes (Coronary Artery Disease) |
| 3 | Old Anterior Myocardial Infarction |
| 4 | Old Inferior Myocardial Infarction |
| 5 | Sinus tachycardia |
| 6 | Sinus bradycardia |
| 7 | Ventricular Premature Contraction (PVC) |
| 8 | Supraventricular Premature Contraction |
| 9 | Left bundle branch block |
| 10 | Right bundle branch block |
| 11 | 1st degree AtrioVentricular block |
| 12 | 2nd degree AV block |
| 13 | 3rd degree AV block |
| 14 | Left ventricular hypertrophy |
| 15 | Atrial Fibrillation or Flutter |
| 16 | Others |

We have split the dataset into 339 training samples and 113 test samples.
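As a rough illustration, this split can be reproduced with pandas and scikit-learn. The file path, missing-value handling, and random seed below are assumptions made for the sketch, not necessarily what the repository does.

```python
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical path/format: the UCI Arrhythmia file is comma-separated with
# no header row, uses '?' for missing values, and the last column is the class label.
data = pd.read_csv("data/arrhythmia.data", header=None, na_values="?")
data = data.fillna(data.mean())  # simple mean imputation (an assumption)

X, y = data.iloc[:, :-1], data.iloc[:, -1]

# 339 training / 113 test samples out of 452, as described above.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=339, test_size=113, random_state=42
)
```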

Dependencies

First and foremost, you need to install Python. The project also depends on the following packages:

  1. Mealpy (https://github.com/thieu1995/mealpy)
  2. Permetrics (https://github.com/thieu1995/permetrics)
  3. Scikit-learn (https://scikit-learn.org/stable/index.html)
  4. Pandas (https://pandas.pydata.org/)
  5. Matplotlib (https://matplotlib.org/)

Setup environment

Pip

```
pip install -r requirements.txt
```

How to run

```
python -m src.models.mha_fs
```

We need to select the best subset of features in the dataset.

Our solution is a 1-D vector in which each dimension corresponds to one column (feature) of the dataset.

  • If its value is 1, the column is selected for the model.
  • If its value is 0, the column is not selected for the model.

Since the optimizers work over real values, for each dimension we need to convert the real value back to either 0 or 1 by taking its floor.

  • The lower bound is 0 for all dimensions (floor of 0 is 0).
  • The upper bound is 1.99 for all dimensions (floor of 1.99 is 1).

Also, if no column ends up selected, we randomly choose one column.
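A minimal sketch of this decoding step (the function name and the random-choice fallback details are ours, not necessarily the repository's exact code):

```python
import numpy as np

def decode_solution(position):
    """Convert a real-valued position vector into a binary feature mask."""
    # Floor each dimension: values in [0, 1) -> 0, values in [1, 1.99] -> 1.
    mask = np.floor(position).astype(int)
    # If no column is selected, randomly choose one column.
    if mask.sum() == 0:
        mask[np.random.randint(len(mask))] = 1
    return mask
```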

We have set our population size to 50 and have run our algorithms for 100 epochs.

Fitness Function

$$ \text{Fitness} = \text{Accuracy} $$

$$ \text{Accuracy} = {\text{Number of Correctly Classified Instances} \over \text{Total Number of Instances}} $$
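Building on the sketches above, the fitness evaluation might look like the following. It trains a Random Forest on the selected columns and returns the test accuracy; the mealpy-style problem dictionary and all hyperparameters are assumptions based on the mealpy 2.x documentation and may not match the repository's code exactly.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

def fitness_function(position):
    # decode_solution and the train/test splits come from the earlier sketches.
    mask = decode_solution(position)
    cols = [i for i, bit in enumerate(mask) if bit == 1]

    clf = RandomForestClassifier(random_state=42)  # hyperparameters are assumptions
    clf.fit(X_train.iloc[:, cols], y_train)
    y_pred = clf.predict(X_test.iloc[:, cols])

    # Fitness = Accuracy = correctly classified instances / total instances
    return accuracy_score(y_test, y_pred)

n_features = 279
problem = {
    "fit_func": fitness_function,
    "lb": [0] * n_features,     # lower bound 0 for every dimension
    "ub": [1.99] * n_features,  # upper bound 1.99 for every dimension
    "minmax": "max",            # maximize accuracy
}
```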

Differential Evolution

To learn more about Differential Evolution refer to the link given below.

Link

We have used the DE/rand/2/bin strategy for our Differential Evolution algorithm.

Our weighting factor is 0.8 and our crossover rate/probability is 0.9.
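For reference, here is a minimal, generic sketch of one DE/rand/2/bin step with these settings; mealpy's internal implementation may differ in its details.

```python
import numpy as np

F, CR = 0.8, 0.9  # weighting factor and crossover probability, as stated above

def de_rand_2_bin(pop, i):
    """Produce a trial vector for individual i using DE/rand/2/bin."""
    n = len(pop[i])
    # Pick five distinct individuals different from i.
    r1, r2, r3, r4, r5 = np.random.choice(
        [j for j in range(len(pop)) if j != i], size=5, replace=False
    )
    # Mutation: one base vector plus two scaled difference vectors.
    mutant = pop[r1] + F * (pop[r2] - pop[r3]) + F * (pop[r4] - pop[r5])
    # Binomial crossover, taking at least one dimension from the mutant.
    cross = np.random.rand(n) < CR
    cross[np.random.randint(n)] = True
    trial = np.where(cross, mutant, pop[i])
    return np.clip(trial, 0, 1.99)  # keep within the bounds described earlier
```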

The graphs below depict the global best fitness value and the local best fitness value for the Differential Evolution algorithm as a function of the number of epochs/iterations.

[Graph: local best fitness vs. epochs]

[Graph: global best fitness vs. epochs]

These graphs are plotted and saved in the graphs/DE folder after running our code with the Differential Evolution algorithm for 100 epochs.

Genetic Algorithm

To learn more about Genetic Algorithm refer to the link given below.

Link

We have chosen tournament selection for the selection step. In each tournament, 10 individuals from the population compete and the fittest is selected.

We have chosen uniform crossover with a crossover probability of 0.95.

We have chosen multi-point mutation with a mutation probability of 0.025, using the flip method, wherein a 1 is replaced with a 0 (or vice versa) in a particular dimension of the solution.
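A minimal, generic sketch of these three GA operators on the binary encoding (function names are ours; mealpy's implementation may differ):

```python
import random
import numpy as np

def tournament_selection(pop, fitnesses, k=10):
    """Pick the fittest of k randomly chosen individuals."""
    contenders = random.sample(range(len(pop)), k)
    best = max(contenders, key=lambda idx: fitnesses[idx])
    return pop[best]

def uniform_crossover(p1, p2, pc=0.95):
    """Swap genes between two parents gene-by-gene, with crossover probability pc."""
    if random.random() > pc:
        return p1.copy(), p2.copy()
    swap = np.random.rand(len(p1)) < 0.5
    return np.where(swap, p2, p1), np.where(swap, p1, p2)

def flip_mutation(child, pm=0.025):
    """Flip each bit (0 <-> 1) independently with probability pm."""
    flips = np.random.rand(len(child)) < pm
    return np.where(flips, 1 - child, child)
```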

The graphs below depict the global best fitness value and the local best fitness value for the Genetic Algorithm as a function of the number of epochs/iterations.

[Graph: local best fitness vs. epochs]

[Graph: global best fitness vs. epochs]

These graphs are plotted and saved in the graphs/GA folder after running our code with the Genetic Algorithm for 100 epochs.

Artificial Bee Colony

To learn more about Artificial Bee Colony refer to the link given below.

Link

We have set the limit on the number of trials before abandoning a food source (solution) to 5.
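A rough sketch of how this trial limit drives the scout (abandonment) phase in ABC; the names and structure are ours, not mealpy's exact code.

```python
import numpy as np

TRIAL_LIMIT = 5  # trials allowed before a food source is abandoned

def scout_phase(pop, trials, lb=0.0, ub=1.99):
    """Replace food sources whose trial counter has reached the limit."""
    for i in range(len(pop)):
        if trials[i] >= TRIAL_LIMIT:
            # The exhausted source is abandoned; the scout bee generates
            # a new random solution within the search bounds.
            pop[i] = np.random.uniform(lb, ub, size=len(pop[i]))
            trials[i] = 0
    return pop, trials
```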

The graphs below depict the global best fitness value and the local best fitness value for the Artificial Bee Colony algorithm as a function of the number of epochs/iterations.

[Graph: local best fitness vs. epochs]

[Graph: global best fitness vs. epochs]

These graphs are plotted and saved in the graphs/ABC folder after running our code with the Artificial Bee Colony algorithm for 100 epochs.

Particle Swarm Optimization

To learn more about Particle Swarm Optimization refer to the link given below.

Link

We have set both the local (cognitive) and global (social) velocity coefficients to 2.05, and the inertia weight of each particle to a random value between 0.4 and 0.9.
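For reference, the standard PSO velocity and position update with these coefficients (a generic sketch, not the exact mealpy implementation):

```python
import numpy as np

C1 = C2 = 2.05  # cognitive (local) and social (global) coefficients

def pso_step(position, velocity, personal_best, global_best):
    """One velocity and position update for a single particle."""
    w = np.random.uniform(0.4, 0.9)  # inertia weight drawn per update
    r1, r2 = np.random.rand(2)
    velocity = (w * velocity
                + C1 * r1 * (personal_best - position)
                + C2 * r2 * (global_best - position))
    position = np.clip(position + velocity, 0, 1.99)  # respect the search bounds
    return position, velocity
```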

The graphs below depict the global best fitness value and the local best fitness value for the Particle Swarm Optimization algorithm as a function of the number of epochs/iterations.

[Graph: local best fitness vs. epochs]

[Graph: global best fitness vs. epochs]

These graphs are plotted and saved in the graphs/PSO folder after running our code with the Particle Swarm Optimization algorithm for 100 epochs.
