# Exact Combinatorial Optimization with Graph Convolutional Neural Networks (Ecole + PyTorch + PyTorch Geometric reimplementation)

This is the official reimplementation of the GNN model proposed in the NeurIPS 2019 paper "Exact Combinatorial Optimization with Graph Convolutional Neural Networks", built on the Ecole library. This reimplementation uses PyTorch instead of TensorFlow, and PyTorch Geometric for handling the GNN. As a consequence, much of the code is now simpler. Slight discrepancies in results from the original implementation are to be expected.

As mentioned, this repo implements only the GNN model. For comparisons with the other ML competitors (ExtraTrees, LambdaMART and SVMRank), please see the original implementation here.
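The paper's model operates on a bipartite encoding of the MILP: one node per constraint, one node per variable, and an edge for every nonzero coefficient of the constraint matrix. Each GNN layer then performs two interleaved "half-convolutions" (variables to constraints, then constraints back to variables). A minimal pure-Python sketch of this message-passing pattern, where scalar features and a plain sum stand in for the learned embeddings and MLPs of the actual model:

```python
def half_convolution(targets, sources, edges):
    """One 'half-convolution': each target node aggregates messages from
    its neighbouring source nodes. Scalar features and plain summation
    stand in for the learned embeddings and MLPs of the real model."""
    out = []
    for i, t in enumerate(targets):
        msgs = [sources[j] for (ti, j) in edges if ti == i]
        out.append(t + sum(msgs))
    return out

# Toy problem with 2 constraints and 3 variables; each (constraint, variable)
# edge marks a nonzero coefficient of the constraint matrix A.
edges = [(0, 0), (0, 1), (1, 1), (1, 2)]
cons = [1.0, 2.0]        # constraint-node features (illustrative values)
vars_ = [0.5, 1.5, 2.5]  # variable-node features (illustrative values)

# One GNN layer: constraints gather from variables, then variables gather back
# along the reversed edges.
cons = half_convolution(cons, vars_, edges)
vars_ = half_convolution(vars_, cons, [(v, c) for (c, v) in edges])
print(cons)   # [3.0, 6.0]
print(vars_)  # [3.5, 10.5, 8.5]
```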

## Authors

Maxime Gasse, Didier Chételat, Nicola Ferroni, Laurent Charlin and Andrea Lodi.

## Installation

Our recommended installation uses the Conda package manager. The previous implementation required compiling a patched version of SCIP and PySCIPOpt using Cython. This is no longer necessary: Conda packages for both are now available as dependencies of the Ecole Conda package itself.

Instructions: install Ecole, PyTorch and PyTorch Geometric using Conda. At the time of writing, this can be accomplished by running:

```bash
conda install ecole
conda install pytorch torchvision torchaudio cudatoolkit=10.2 -c pytorch
conda install pyg -c pyg -c conda-forge
```

Please refer to the most up-to-date installation instructions for Ecole, PyTorch and PyTorch Geometric if you encounter any errors.
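After installation, a quick sanity check is to verify that the three packages are importable. A small sketch using the standard library (the module names below are the usual import names; in particular, `torch_geometric` is assumed to be the import name of the PyTorch Geometric package):

```python
import importlib.util

def check_packages(names):
    """Report which of the given modules can be located, without actually
    importing them (find_spec only queries the import machinery)."""
    return {name: importlib.util.find_spec(name) is not None for name in names}

# Import names assumed for the three dependencies installed above.
status = check_packages(["ecole", "torch", "torch_geometric"])
for name, found in status.items():
    print(f"{name}: {'OK' if found else 'MISSING'}")
```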

## Benchmarks

For every benchmark in the paper, we describe the code for running the experiments, and compare the results against the original implementation.

### Set Covering

```bash
# Generate MILP instances
python 01_generate_instances.py setcover
# Generate supervised learning datasets
python 02_generate_dataset.py setcover -j 4  # number of available CPUs
# Training
for i in {0..4}
do
    python 03_train_gnn.py setcover -s $i
done
# Evaluation
python 04_evaluate.py setcover
```
| Model | Easy: Time | Easy: Nodes | Medium: Time | Medium: Nodes | Hard: Time | Hard: Nodes |
|---|---|---|---|---|---|---|
| SCIP default | | | | | | |
| GNN (original) | | | | | | |
| GNN (reimplementation) | | | | | | |
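Solving times and node counts of this kind are conventionally aggregated over instances and seeds with a 1-shifted geometric mean, as reported in the original paper. A small sketch of that aggregation:

```python
import math

def shifted_geometric_mean(values, shift=1.0):
    """Geometric mean of (v + shift), minus shift. The shift damps the
    influence of very small measurements (e.g. sub-second solves)."""
    log_sum = sum(math.log(v + shift) for v in values)
    return math.exp(log_sum / len(values)) - shift

# Illustrative per-instance solving times in seconds (not real results).
times = [0.5, 2.0, 8.0]
print(round(shifted_geometric_mean(times), 3))
```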

### Combinatorial Auction

```bash
# Generate MILP instances
python 01_generate_instances.py cauctions
# Generate supervised learning datasets
python 02_generate_dataset.py cauctions -j 4  # number of available CPUs
# Training
for i in {0..4}
do
    python 03_train_gnn.py cauctions -s $i
done
# Evaluation
python 04_evaluate.py cauctions
```

### Capacitated Facility Location

```bash
# Generate MILP instances
python 01_generate_instances.py facilities
# Generate supervised learning datasets
python 02_generate_dataset.py facilities -j 4  # number of available CPUs
# Training
for i in {0..4}
do
    python 03_train_gnn.py facilities -s $i
done
# Evaluation
python 04_evaluate.py facilities
```

### Maximum Independent Set

```bash
# Generate MILP instances
python 01_generate_instances.py indset
# Generate supervised learning datasets
python 02_generate_dataset.py indset -j 4  # number of available CPUs
# Training
for i in {0..4}
do
    python 03_train_gnn.py indset -s $i
done
# Evaluation
python 04_evaluate.py indset
```

## Citation

Please cite our paper if you use this code in your work.

```bibtex
@inproceedings{conf/nips/GasseCFCL19,
  title={Exact Combinatorial Optimization with Graph Convolutional Neural Networks},
  author={Gasse, Maxime and Chételat, Didier and Ferroni, Nicola and Charlin, Laurent and Lodi, Andrea},
  booktitle={Advances in Neural Information Processing Systems 32},
  year={2019}
}
```

## Questions / Bugs

Please feel free to submit a GitHub issue if you have any questions or find any bugs. We do not guarantee support, but will do our best to help.