Comparing Compression Techniques for Deep Learning Models

Table of Contents

  • Introduction
  • Summary of Results
  • Visualization
  • Detailed Analysis
  • Practical Applications
  • Conclusion
  • How to Run the Project
  • Technical Report
  • Contributing

Introduction

This project compares various model optimization techniques, specifically pruning, clustering, and quantization, to evaluate their impact on model size and performance.
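The sketch below shows, in outline, how each technique can be applied to a Keras model. It assumes the TensorFlow Model Optimization Toolkit (tensorflow_model_optimization) and a small placeholder architecture; the actual models, hyperparameters, and training code live in the project notebooks.

    import tensorflow as tf
    import tensorflow_model_optimization as tfmot

    def build_baseline():
        # Placeholder architecture; the real "teacher" model is defined in the notebooks.
        return tf.keras.Sequential([
            tf.keras.Input(shape=(28, 28)),
            tf.keras.layers.Flatten(),
            tf.keras.layers.Dense(128, activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])

    # Pruning: gradually drive a fraction of each layer's weights to zero during fine-tuning.
    pruned = tfmot.sparsity.keras.prune_low_magnitude(
        build_baseline(),
        pruning_schedule=tfmot.sparsity.keras.PolynomialDecay(
            initial_sparsity=0.0, final_sparsity=0.5, begin_step=0, end_step=1000),
    )

    # Clustering: share each layer's weights among a small number of centroids.
    clustered = tfmot.clustering.keras.cluster_weights(
        build_baseline(),
        number_of_clusters=16,
        cluster_centroids_init=tfmot.clustering.keras.CentroidInitialization.LINEAR,
    )

    # Quantization: convert to TFLite with post-training dynamic-range quantization.
    converter = tf.lite.TFLiteConverter.from_keras_model(build_baseline())
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_model = converter.convert()

The pruned and clustered models are then compiled and fine-tuned like any other Keras model (pruning additionally requires the tfmot.sparsity.keras.UpdatePruningStep callback) before being compared against the baseline.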

Summary of Results

| Model           | Size (KB) | Training Accuracy | Validation Accuracy | Test Accuracy (Quantized) |
| --------------- | --------- | ----------------- | ------------------- | ------------------------- |
| Teacher Model   | 4075.12   | 99.55%            | 90.10%              | 91.57%                    |
| Pruned Model    | 1365.81   | 99.36%            | 89.23%              | 90.30%                    |
| Clustered Model | 1365.78   | 98.85%            | 89.27%              | 90.03%                    |

Visualization

Here are some visual comparisons of the model sizes and accuracies:

[Figure: Model Sizes and Accuracies]

Individual Metrics

[Figures: Size, Training Accuracy, Validation Accuracy]

Detailed Analysis

  • Model Size Reduction: Both pruning and clustering cut the model from 4075.12 KB to roughly 1366 KB, a reduction of about 66% (a sketch of how this can be measured follows this list).
  • Maintaining Accuracy: Despite the smaller size, accuracy stayed high: validation accuracy dropped by less than one percentage point (90.10% for the teacher versus 89.23% pruned and 89.27% clustered).
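
Because pruning and clustering mainly make the weights more compressible, one common way to verify the size reduction is to compare the zipped size of each saved model. The snippet below is a minimal sketch of that measurement; the file names are hypothetical, and whether the sizes in the table above were measured this way is an assumption.

    import os
    import tempfile
    import zipfile

    def zipped_size_kb(model_file: str) -> float:
        # Compress the saved model file and report its size in KB; pruned and
        # clustered weights compress much better than dense, unshared weights.
        _, zipped_path = tempfile.mkstemp(".zip")
        with zipfile.ZipFile(zipped_path, "w", compression=zipfile.ZIP_DEFLATED) as zf:
            zf.write(model_file)
        return os.path.getsize(zipped_path) / 1024

    # Hypothetical file names; the notebooks define where each model is saved.
    for name in ("teacher_model.h5", "pruned_model.h5", "clustered_model.h5"):
        print(name, f"{zipped_size_kb(name):.2f} KB")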

Practical Applications

  • Mobile Devices and Wearables: Real-time analytics, health monitoring, predictive text input.
  • IoT Devices: Localized and immediate responses without relying on constant cloud communication.

Conclusion

This project demonstrated that pruning and clustering are effective techniques for model compression, achieving significant reductions in model size with minimal impact on accuracy.

How to Run the Project

  1. Clone the repository:
    git clone https://github.com/eliaselhaddad/Thesis.git
    cd Thesis
  2. Create and activate a virtual environment (optional but recommended):
    python -m venv env
    source env/bin/activate  # On Windows use `env\Scripts\activate`
  3. Install the required dependencies:
    pip install -r requirements.txt
  4. Follow the provided Jupyter notebooks to reproduce the results:
    jupyter notebook
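
The table above reports test accuracy for quantized models. As a rough guide, the sketch below converts a Keras model to TFLite with post-training dynamic-range quantization and evaluates it with the TFLite interpreter; the untrained placeholder model and the MNIST data are stand-ins, since the real models and dataset come from the notebooks.

    import numpy as np
    import tensorflow as tf

    # Stand-in data and model; replace with the trained models from the notebooks.
    (_, _), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
    x_test = x_test.astype("float32") / 255.0

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28)),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

    # Post-training dynamic-range quantization to TFLite.
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    tflite_model = converter.convert()

    # Evaluate the quantized model with the TFLite interpreter.
    interpreter = tf.lite.Interpreter(model_content=tflite_model)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]

    correct = 0
    for x, y in zip(x_test, y_test):
        interpreter.set_tensor(inp["index"], x[np.newaxis, ...])
        interpreter.invoke()
        correct += int(np.argmax(interpreter.get_tensor(out["index"])) == y)
    print("Quantized test accuracy:", correct / len(y_test))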

Technical Report

For more detailed information, please refer to Technical Report.pdf in this repository.

Contributing

Contributions are welcome! Please open an issue or submit a pull request if you have any suggestions or improvements.
