How to define my own sampler in ddp? #4680
Replies: 7 comments
-
You can add your sampler to
-
Following up on this, custom ddp samplers take
-
You can define it in a class inherited from pl.LightningDataModule; global_rank is set before this. I have tried it and no error is produced.
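For illustration, a minimal sketch of that approach, assuming a custom sampler that subclasses DistributedSampler and a DataModule that builds it in train_dataloader. MyDistributedSampler, MyDataModule, and the toy TensorDataset are made-up names for this sketch, not from the thread:

```python
import pytorch_lightning as pl
import torch
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


class MyDistributedSampler(DistributedSampler):
    """Hypothetical custom sampler; reuses DistributedSampler's rank handling."""

    def __iter__(self):
        # The parent already yields only this rank's shard of indices;
        # apply any custom ordering or filtering on top of it here.
        indices = list(super().__iter__())
        return iter(indices)


class MyDataModule(pl.LightningDataModule):
    def setup(self, stage=None):
        # Toy dataset just to keep the sketch self-contained.
        self.train_set = TensorDataset(torch.arange(1000).float().unsqueeze(1))

    def train_dataloader(self):
        # Under DDP the process group is initialized before train_dataloader()
        # runs, so DistributedSampler can read the rank and world size itself.
        sampler = MyDistributedSampler(self.train_set, shuffle=True)
        return DataLoader(self.train_set, batch_size=32, sampler=sampler)
```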
-
Could you provide sample code for this, so we can check that it reproduces with no error, specifically showing where global_rank is obtained before being passed into the sampler? From what I could tell,
-
global_rank is set in DistributedSampler.
-
DistributedSampler is instantiated at the same time by the different DDP processes, and each one gets a different rank (see self.rank in DistributedSampler). According to this rank, each GPU samples different data.
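To make that concrete, here is a simplified sketch (not the actual torch.utils.data.DistributedSampler source) of how each process turns its rank into a disjoint slice of the same shuffled index list; per_rank_indices is an illustrative helper name:

```python
import torch
import torch.distributed as dist


def per_rank_indices(dataset_len, epoch=0, shuffle=True):
    """Roughly what DistributedSampler does for the current process."""
    rank = dist.get_rank()              # this process's position in the group
    world_size = dist.get_world_size()  # total number of DDP processes
    if shuffle:
        g = torch.Generator()
        g.manual_seed(epoch)  # same seed on every rank -> identical permutation
        indices = torch.randperm(dataset_len, generator=g).tolist()
    else:
        indices = list(range(dataset_len))
    # Pad so every rank gets the same number of samples, then take a strided
    # slice: rank 0 keeps indices[0::world_size], rank 1 keeps [1::world_size], ...
    indices += indices[: (-dataset_len) % world_size]
    return indices[rank::world_size]
```

Because all ranks share the seed, the shards are disjoint and together cover the whole dataset, which is why each GPU sees different data.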
-
Normally, I would define my own sampler in the dataloader. When using ddp as the accelerator, I don't know where to place it.
If I define a class inherited from pl.LightningDataModule and have my own sampler class inherited from DistributedSampler, which function is the right place for my own sampler: def prepare_data(self), def setup(self, stage), or def train_dataloader(self)?
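Putting the replies above together, the usual split is: prepare_data for one-time downloads, setup for building the datasets, and train_dataloader for creating the sampler and DataLoader. Below is a hedged sketch of the Trainer wiring, where MyDataModule comes from the earlier sketch and MyLightningModule is a stand-in for your own module; replace_sampler_ddp is included because, depending on the Lightning version, the Trainer otherwise swaps in its own DistributedSampler:

```python
import pytorch_lightning as pl

# prepare_data():     download / write files only (called on a single process).
# setup(stage):       build the Dataset objects (called on every DDP process).
# train_dataloader(): build the custom sampler + DataLoader here, since the
#                     process group (and hence the rank) exists by this point.

trainer = pl.Trainer(
    gpus=2,
    accelerator="ddp",          # one process per GPU
    replace_sampler_ddp=False,  # keep the custom sampler instead of Lightning's default
)
trainer.fit(MyLightningModule(), datamodule=MyDataModule())  # hypothetical classes
```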