Kaiming He (KaimingHe) · Facebook · http://kaiminghe.com/ · Research Scientist at FAIR

KaimingHe/deep-residual-networks 5206

Deep Residual Learning for Image Recognition

ShaoqingRen/faster_rcnn 2243

Faster R-CNN

KaimingHe/resnet-1k-layers 717

Deep Residual Networks with 1K Layers

ShaoqingRen/caffe 102

Caffe fork that supports SPP_net or faster R-CNN

issue comment HobbitLong/CMC

Estimating normalization factor Z

A momentum of 0.99 for updating Z works well. On ImageNet-1K, MoCo with NCE is ~2% worse than MoCo with InfoNCE, similar to the case of the memory-bank counterpart.

KaimingHe

comment created 2 months ago
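The two losses being compared can be sketched in plain Python (an illustrative sketch, not the CMC or MoCo implementation; all function names and arguments here are hypothetical). NCE treats each score as a binary data-vs-noise classification against a uniform noise distribution Pn = 1/n and requires the normalization factor Z; InfoNCE is a (K+1)-way softmax cross-entropy with the positive as the target class and does not:

```python
import math

def nce_loss(pos, negs, Z, n):
    """NCE loss for one positive score and K negative scores.

    pos/negs are already-exponentiated similarity scores, e.g. exp(s / T);
    Z is the (estimated) partition function; the noise distribution is
    uniform, Pn = 1/n over n data points.
    """
    K = len(negs)
    pn = 1.0 / n
    # positive sample classified as "data": P(D=1|s) = (s/Z) / (s/Z + K*Pn)
    loss = -math.log((pos / Z) / (pos / Z + K * pn))
    # each negative classified as "noise": P(D=0|s) = K*Pn / (s/Z + K*Pn)
    for s in negs:
        loss -= math.log(K * pn / (s / Z + K * pn))
    return loss

def info_nce_loss(pos, negs):
    """InfoNCE: (K+1)-way softmax cross-entropy, no explicit Z needed."""
    denom = pos + sum(negs)
    return -math.log(pos / denom)
```

Note that `nce_loss` depends on Z while `info_nce_loss` normalizes over the sampled scores themselves, which is why a biased Z estimate hurts only the NCE variant.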

issue comment HobbitLong/CMC

Estimating normalization factor Z

You reported a low number for MoCo with the NCE loss. This is because your implementation of NCE is problematic; correcting it should give a more reasonable MoCo w/ NCE number.

KaimingHe

comment created 2 months ago

issue opened HobbitLong/CMC

Estimating normalization factor Z

https://github.com/HobbitLong/CMC/blob/0f72b18a99e35bf2c2f0001656c2b33365b50cf6/NCE/NCEAverage.py#L189

This one-time estimation is problematic, especially if the dictionary is not random noise. Computing Z as a moving average of this would give a more reasonable result.

created 2 months ago
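The suggested fix can be sketched as follows (a hypothetical sketch, not the CMC code; the class name is illustrative, and the momentum default of 0.99 follows the comment above). Instead of freezing Z after the first batch, keep an exponential moving average of the per-batch estimates:

```python
class RunningZ:
    """Exponential moving average of the NCE normalization factor Z."""

    def __init__(self, momentum=0.99):
        self.momentum = momentum
        self.z = None  # unset until the first batch is seen

    def update(self, batch_z):
        # batch_z: Z estimated from the current batch's scores,
        # e.g. mean(exp(score / T)) * n_data
        if self.z is None:
            self.z = batch_z  # initialize from the first batch
        else:
            self.z = self.momentum * self.z + (1 - self.momentum) * batch_z
        return self.z
```

With a momentum near 1.0, the estimate tracks slow drift in the dictionary's score distribution while smoothing out per-batch noise, rather than being pinned to whatever the first batch happened to contain.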

starred facebookresearch/detectron2

starred 4 months ago
