Optimization-Project

IASD Master Optimization Project: Study of Gradient Descent Techniques

The aim of this project is to study and compare different optimization techniques on a linear regression problem over a medium-scale dataset.

The project is concerned with going beyond first-order techniques in machine learning. Algorithmic frameworks such as gradient descent and stochastic gradient descent are inherently first-order methods, in that they rely solely on first-order derivatives. Second-order methods, on the other hand, make use of higher-order information, either explicitly or implicitly. Although these techniques are widely used in scientific computing, their use in machine learning is not yet widespread. The goal of this project is to illustrate the performance of these techniques on learning problems involving both synthetic and real data (a short illustrative sketch follows the phase list). The project is decomposed as follows:

  • Phase 1: Gradient Descent
  • Phase 2: Automatic Differentiation
  • Phase 3: Stochastic Gradient Descent
  • Phase 4: Convexity & Constrained Optimization
  • Phase 5: Proximal Gradient & LASSO
  • Phase 6: Large-Scale and Distributed Optimization
  • Phase 7: Advanced Topics on Gradient Descent
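
To make the first-order vs. second-order contrast above concrete, here is a minimal sketch, not taken from the repository, that runs plain gradient descent and a single Newton step on the least-squares objective 0.5·||Xw − y||²; the function names, hyperparameters, and synthetic data are illustrative assumptions.

```python
# Illustrative sketch (assumed, not from this repository): first-order vs.
# second-order optimization of the least-squares objective 0.5 * ||Xw - y||^2.
import numpy as np

def gradient_descent(X, y, lr=1e-3, n_iters=5000):
    """First-order method: relies only on the gradient X^T (Xw - y)."""
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y)
        w -= lr * grad
    return w

def newton_step(X, y):
    """Second-order method: uses the Hessian X^T X. For a quadratic objective,
    a single Newton step from w = 0 reaches the minimizer (normal equations)."""
    w = np.zeros(X.shape[1])
    grad = X.T @ (X @ w - y)
    hess = X.T @ X
    return w - np.linalg.solve(hess, grad)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))                            # synthetic features
    y = X @ rng.normal(size=5) + 0.1 * rng.normal(size=200)  # noisy targets
    print("Gradient descent:", gradient_descent(X, y))
    print("Newton step:     ", newton_step(X, y))
```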

Link to the dataset used: https://www.kaggle.com/datasets/ujjwalchowdhury/energy-efficiency-data-set
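
As a companion to the sketch above, the snippet below shows one way the energy-efficiency data could be set up as a regression problem. The local filename and the assumption that the last two columns are the targets (heating and cooling load) are hypothetical and may need adjusting to the downloaded CSV.

```python
# Hypothetical setup of the energy-efficiency CSV as a least-squares problem.
import numpy as np
import pandas as pd

df = pd.read_csv("energy_efficiency_data.csv")   # assumed local filename
X = df.iloc[:, :-2].to_numpy(dtype=float)        # building features
y = df.iloc[:, -2].to_numpy(dtype=float)         # e.g. heating load as target

# Standardize the features and add an intercept column before running the
# solvers sketched above.
X = (X - X.mean(axis=0)) / X.std(axis=0)
X = np.hstack([np.ones((X.shape[0], 1)), X])
```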
