Can't use one GPU?
What should I do if I want to use only one or two GPUs in one server?
Answer from KaimingHe:
We exploit the fact that, by default, BN is split across multiple GPUs, so the mean/std are computed independently on each GPU. To run on one GPU, you may implement a BN that splits along the N (batch) dimension to mimic this effect (see NaiveBatchNorm in Detectron2). You also need to change the lr (e.g., linearly) if you change the batch size to fit in memory. To run on 2 GPUs, try:

`--lr 0.0075 --batch-size 64`
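A minimal NumPy sketch of the split-BN idea (an illustration, not the repo's actual code): divide the batch into chunks and normalize each chunk with its own mean/std, mimicking BN statistics computed independently per GPU. The function name and 2-D `(N, C)` layout are illustrative assumptions.

```python
import numpy as np

def split_batch_norm(x, num_splits, eps=1e-5):
    """Normalize each of `num_splits` chunks of the batch independently,
    mimicking BN statistics computed per GPU. x has shape (N, C)."""
    n = x.shape[0]
    assert n % num_splits == 0, "batch must divide evenly into splits"
    out = np.empty_like(x)
    for chunk in np.split(np.arange(n), num_splits):
        m = x[chunk].mean(axis=0)  # per-chunk mean, like one GPU's BN stats
        v = x[chunk].var(axis=0)   # per-chunk variance
        out[chunk] = (x[chunk] - m) / np.sqrt(v + eps)
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=(16, 3))
y = split_batch_norm(x, num_splits=4)
# Each 4-sample chunk is normalized with its own statistics,
# so its per-chunk mean is ~0:
print(np.allclose(y[:4].mean(axis=0), 0.0, atol=1e-6))  # True
```

The suggested `--lr 0.0075` follows linear scaling: assuming the repo's defaults of lr 0.03 at batch size 256, then 0.03 × 64/256 = 0.0075.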