## Getting Started

We visualize our training details via wandb (https://wandb.ai/site).

### Visualization

  1. You'll need to log in:

    $ wandb login

    You can find your API key at https://wandb.ai/authorize, then copy & paste it into the terminal.

  2. You can (optionally) add the key to "code/config/config.py" for server use (see the sketch after this list), with

    C.wandb_key = ""
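For server runs, here is a minimal sketch of how such a key could be used for non-interactive login; the import path of the config module is an assumption and may differ from the actual repository layout.

```python
# Hedged sketch: log in to wandb with the key stored in the config, falling back to the
# interactive `wandb login` flow when no key is set. The config import path is assumed.
import wandb
from config import config as C  # hypothetical import for code/config/config.py

if getattr(C, "wandb_key", ""):
    wandb.login(key=C.wandb_key)  # non-interactive login, convenient on remote servers
else:
    wandb.login()  # prompts for the API key from https://wandb.ai/authorize
```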

### Training

Our code is trained on a single NVIDIA RTX 3090 GPU, but it also supports PyTorch's distributed data parallel (DDP) mode. We set batch_size=8 for all experiments, with a learning rate of 1e-5 and a 900 * 900 input resolution.
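As a rough illustration of the distributed mode, below is a minimal sketch of the standard PyTorch DDP setup; it is not the repository's actual launcher, and the environment variables assume a `torchrun`-style launch.

```python
# Minimal DDP sketch (not the repo's launcher): initialize the process group and wrap the
# model for multi-GPU training. RANK/WORLD_SIZE/LOCAL_RANK are assumed to be set by torchrun.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def wrap_model_for_ddp(model: torch.nn.Module) -> torch.nn.Module:
    """Initialize the default process group and wrap the model for multi-GPU training."""
    dist.init_process_group(backend="nccl")        # reads rank/world size from the launcher env
    local_rank = int(os.environ["LOCAL_RANK"])     # set per process by torchrun
    torch.cuda.set_device(local_rank)
    return DDP(model.cuda(local_rank), device_ids=[local_rank])
```

With a setup like this, a multi-GPU run would typically be launched via `torchrun --nproc_per_node=<num_gpus> code/main.py`, though the exact launch command for this repository may differ.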

### Checkpoints

We follow Meta-OoD and use the DeepLabV3+ checkpoint available here. You'll need to put it in the "ckpts/pretrained_ckpts" directory; please note that downloading the checkpoint before running the code is necessary for our approach.
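For illustration only, a hedged sketch of loading such a pretrained checkpoint is shown below; the filename and state-dict layout are assumptions, not the repository's exact code.

```python
# Hedged sketch: load pretrained DeepLabV3+ weights from ckpts/pretrained_ckpts.
# The checkpoint filename and key layout are assumptions.
import torch

def load_pretrained_deeplab(model: torch.nn.Module,
                            ckpt_path: str = "ckpts/pretrained_ckpts/deeplabv3plus.pth"):
    """Load pretrained weights into the segmentation backbone before training."""
    state = torch.load(ckpt_path, map_location="cpu")
    # Some checkpoints nest the weights under a "state_dict" key; handle both layouts.
    state_dict = state.get("state_dict", state) if isinstance(state, dict) else state
    missing, unexpected = model.load_state_dict(state_dict, strict=False)
    return missing, unexpected
```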

For training, simply execute

$ python code/main.py 

### Inference

Please download our checkpoint from here and specify the checkpoint path ("ckpts/pebal_weight_path") in the config file, then run

    $ python code/test.py
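For reference, here is a minimal sketch of what the evaluation forward pass boils down to; the preprocessing and model API are assumptions, and `code/test.py` handles the full evaluation pipeline itself.

```python
# Hedged sketch (assumed model API and preprocessing): a no-grad forward pass at the
# training resolution noted above; not the repository's exact evaluation code.
import torch
import torch.nn.functional as F

@torch.no_grad()
def predict(model: torch.nn.Module, image: torch.Tensor) -> torch.Tensor:
    """Return per-pixel class predictions for a (1, 3, H, W) float tensor."""
    model.eval()
    image = F.interpolate(image, size=(900, 900), mode="bilinear", align_corners=False)
    logits = model(image)        # (1, num_classes, 900, 900) class logits
    return logits.argmax(dim=1)  # (1, 900, 900) predicted labels
```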