profile
Ross Girshick rbgirshick UC Berkeley

rbgirshick/py-faster-rcnn 7121

Faster R-CNN (Python implementation) -- see https://github.com/ShaoqingRen/faster_rcnn for the official MATLAB version

rbgirshick/fast-rcnn 2878

Fast R-CNN

ShaoqingRen/faster_rcnn 2380

Faster R-CNN

rbgirshick/rcnn 2090

R-CNN: Regions with Convolutional Neural Network Features

rbgirshick/yacs 627

YACS -- Yet Another Configuration System

rbgirshick/voc-dpm 554

Object detection system using deformable part models (DPMs) and latent SVM (voc-release5). You may want to use the latest tarball on my website. The github code may include code changes that have not been tested as thoroughly and will not necessarily reproduce the results on the website.

rbgirshick/caffe-fast-rcnn 344

Caffe fork that supports Fast R-CNN

rbgirshick/DeepPyramid 122

Deep feature pyramids for various computer vision algorithms (DPMs, pyramid R-CNN, etc.)

ShaoqingRen/caffe 102

Caffe fork that supports SPP_net or faster R-CNN

rbgirshick/star-cascade 70

Cascade Object Detection with Deformable Part Models -- Add-on package for voc-release4.01

issue closed facebookresearch/detectron2

Anchor generator fails broadcasting

Instructions To Reproduce the 🐛 Bug:

Config file like:

  ANCHOR_GENERATOR:
    SIZES: [[[32, 64], [64, 128], [128, 256], [256, 512], [512, 1024]]]
    ASPECT_RATIOS: [[0.5, 1.0, 2.0]]
  RPN:
    IN_FEATURES: ["p2", "p3", "p4", "p5", "p6"]

Log

Traceback (most recent call last):
  File "asbestos/train.py", line 71, in <module>
    args=(args,),
  File "/detectron2_repo/detectron2/engine/launch.py", line 62, in launch
    main_func(*args)
  File "asbestos/train.py", line 36, in main
    trainer = Trainer(cfg)
  File "/detectron2_repo/detectron2/engine/defaults.py", line 282, in __init__
    model = self.build_model(cfg)
  File "/detectron2_repo/detectron2/engine/defaults.py", line 440, in build_model
    model = build_model(cfg)
  File "/detectron2_repo/detectron2/modeling/meta_arch/build.py", line 21, in build_model
    model = META_ARCH_REGISTRY.get(meta_arch)(cfg)
  File "/detectron2_repo/detectron2/config/config.py", line 181, in wrapped
    explicit_args = _get_args_from_config(from_config_func, *args, **kwargs)
  File "/detectron2_repo/detectron2/config/config.py", line 236, in _get_args_from_config
    ret = from_config_func(*args, **kwargs)
  File "/detectron2_repo/detectron2/modeling/meta_arch/rcnn.py", line 78, in from_config
    "proposal_generator": build_proposal_generator(cfg, backbone.output_shape()),
  File "/detectron2_repo/detectron2/modeling/proposal_generator/build.py", line 24, in build_proposal_generator
    return PROPOSAL_GENERATOR_REGISTRY.get(name)(cfg, input_shape)
  File "/detectron2_repo/detectron2/config/config.py", line 181, in wrapped
    explicit_args = _get_args_from_config(from_config_func, *args, **kwargs)
  File "/detectron2_repo/detectron2/config/config.py", line 236, in _get_args_from_config
    ret = from_config_func(*args, **kwargs)
  File "/detectron2_repo/detectron2/modeling/proposal_generator/rpn.py", line 242, in from_config
    ret["anchor_generator"] = build_anchor_generator(cfg, [input_shape[f] for f in in_features])
  File "/detectron2_repo/detectron2/modeling/anchor_generator.py", line 380, in build_anchor_generator
    return ANCHOR_GENERATOR_REGISTRY.get(anchor_generator)(cfg, input_shape)
  File "/detectron2_repo/detectron2/config/config.py", line 182, in wrapped
    init_func(self, **explicit_args)
  File "/detectron2_repo/detectron2/modeling/anchor_generator.py", line 116, in __init__
    self.cell_anchors = self._calculate_anchors(sizes, aspect_ratios)
  File "/detectron2_repo/detectron2/modeling/anchor_generator.py", line 132, in _calculate_anchors
    self.generate_cell_anchors(s, a).float() for s, a in zip(sizes, aspect_ratios)
  File "/detectron2_repo/detectron2/modeling/anchor_generator.py", line 132, in <listcomp>
    self.generate_cell_anchors(s, a).float() for s, a in zip(sizes, aspect_ratios)
  File "/detectron2_repo/detectron2/modeling/anchor_generator.py", line 199, in generate_cell_anchors
    area = size ** 2.0
TypeError: unsupported operand type(s) for ** or pow(): 'list' and 'float'

Expected behavior:

Apply [32, 64] with [0.5, 1.0, 2.0] to "p2", [64, 128] with [0.5, 1.0, 2.0] to "p3", etc.
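For reference, the expected broadcasting can be sketched in plain Python (a toy illustration, not detectron2 source): with one inner list of sizes per feature map, `zip` pairs each level with its aspect ratios. The extra pair of brackets in the config above wraps all five lists into a single level, so an entire list reaches `size ** 2.0` and triggers the `TypeError`.

```python
# Toy sketch of per-level anchor broadcasting (not detectron2 source):
# one inner list of sizes per feature map, and a single aspect-ratio
# list broadcast to every level.
features = ["p2", "p3", "p4", "p5", "p6"]
sizes = [[32, 64], [64, 128], [128, 256], [256, 512], [512, 1024]]
aspect_ratios = [[0.5, 1.0, 2.0]] * len(sizes)

pairs = list(zip(features, sizes, aspect_ratios))
for _feat, level_sizes, _ratios in pairs:
    for size in level_sizes:
        area = size ** 2.0  # works: size is a number here, not a list
```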

Environment:

Latest detectron2 version.

Extra:

If this is intended, then the docs are wrong or not specific enough: https://detectron2.readthedocs.io/modules/config.html

closed time in 9 minutes

claverru

issue comment facebookresearch/detectron2

Anchor generator fails broadcasting

Never mind, the bug was on my side.

claverru

comment created time in 9 minutes

issue opened facebookresearch/detectron2

Anchor generator fails broadcasting

Instructions To Reproduce the 🐛 Bug:

Config file like:

  ANCHOR_GENERATOR:
    SIZES: [[[32, 64], [64, 128], [128, 256], [256, 512], [512, 1024]]]
    ASPECT_RATIOS: [[0.5, 1.0, 2.0]]
  RPN:
    IN_FEATURES: ["p2", "p3", "p4", "p5", "p6"]

Log

Traceback (most recent call last):
  ...
TypeError: unsupported operand type(s) for ** or pow(): 'list' and 'float'

(full traceback identical to the one in the closed-issue entry above)

Expected behavior:

Applying (e.g.) [32, 64] with [0.5, 1.0, 2.0] to "p2".

Environment:

Latest detectron2 version.

Extra:

If this is intended, then the docs are wrong or not specific enough: https://detectron2.readthedocs.io/modules/config.html

created time in 21 minutes

issue comment facebookresearch/detectron2

Do you support batch inference?

Hi, we got batch inference to work, but inference time grows linearly with batch size. Any idea why?

Hey, what was the final code that you used for batch inference? thank you!

zhaoxuyan

comment created time in 38 minutes

issue opened facebookresearch/detectron2

Is ResizeShortestEdge a data augmentation method or just an image resize in detectron2?

❓ How to do something using detectron2

As far as I know, flip and ResizeShortestEdge are the default data augmentations if we do not adjust anything. So my question is: is ResizeShortestEdge a data augmentation method, or does it just resize the image in detectron2?
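For context, the scaling rule behind a shortest-edge resize can be sketched as follows (a toy re-implementation with assumed parameter names, not detectron2's code): the image is scaled so its shorter side matches `short`, clamped so the longer side does not exceed `max_size`.

```python
def resize_shortest_edge(h, w, short, max_size):
    # Toy sketch of the shortest-edge scaling rule (names assumed).
    scale = short / min(h, w)          # bring the short side to `short`
    if max(h, w) * scale > max_size:   # but cap the long side at `max_size`
        scale = max_size / max(h, w)
    return round(h * scale), round(w * scale)

# A 480x640 image with short=800, max_size=1333 scales to 800x1067.
```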

created time in an hour

push event facebookresearch/phyre

Anton Bakhtin

commit sha ffe3bc39a2b61c34faca8ef22b69d7d51cd5de4b

Bump phyre version (#41) * Make dev sets more representative. Kill old eval setups * Bump phyre version * Update wheel builds on CircleCI

view details

Anton Bakhtin

commit sha bd179c76705b1aae4f5db6c703c85074d46067f3

Merge remote-tracking branch 'origin/master' into ogre

view details

push time in 2 hours

fork VL914/py-faster-rcnn

Faster R-CNN (Python implementation) -- see https://github.com/ShaoqingRen/faster_rcnn for the official MATLAB version

fork in 2 hours

started rbgirshick/py-faster-rcnn

started time in 2 hours

issue comment facebookresearch/detectron2

Do you support batch inference?

Nice table. The last column heading should be Time / Frame (and in milliseconds preferably) rather than FPS.
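The arithmetic backs this up. From the figures quoted later in the thread (256 frames in 31 s), the last column matches seconds per frame, not frames per second:

```python
frames, seconds = 256, 31
time_per_frame = seconds / frames   # ~0.121 s/frame, the "0.12" in the table
fps = frames / seconds              # ~8.26 actual frames per second
```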

zhaoxuyan

comment created time in 3 hours

issue opened facebookresearch/detectron2

What does the output of the COCO evaluator look like when doing keypoint detection?

Instructions To Reproduce the 🐛 Bug:

  1. Full runnable code or full changes you made:
If making changes to the project itself, please use output of the following command:
git rev-parse HEAD; git diff

```python
evaluator = COCOEvaluator("keypoints_test", cfg, False, output_dir="/Users/cijianchao/Desktop/minor_thesis/code/my_project/evaluation_output")
val_loader = build_detection_test_loader(cfg, "keypoints_test")
print(inference_on_dataset(predictor.model, val_loader, evaluator))
```

  2. What exact command you run:
  3. __Full logs__ you observed:

I am doing keypoint detection with detectron2, but I am a little confused about the format of the output. According to the instructions, I should be given the values of AP, AP at OKS=0.5, AP at OKS=0.75, and so on. In my results I do get AP, AP0.5 and AP0.75, but I noticed the 0.5 and 0.75 values are labeled IoU instead of OKS, so I am not sure whether it is correct.

```
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.523
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets=100 ] = 0.901
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets=100 ] = 0.645
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = -1.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = -1.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.523
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=  1 ] = 0.560
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 10 ] = 0.560
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets=100 ] = 0.560
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= small | maxDets=100 ] = -1.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets=100 ] = -1.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets=100 ] = 0.560
[11/30 16:08:24 d2.evaluation.coco_evaluation]: Evaluation results for bbox: 
|   AP   |  AP50  |  AP75  |  APs  |  APm  |  APl   |
|:------:|:------:|:------:|:-----:|:-----:|:------:|
| 52.298 | 90.099 | 64.498 |  nan  |  nan  | 52.298 |
[11/30 16:08:24 d2.evaluation.coco_evaluation]: Some metrics cannot be computed and is shown as NaN.
Loading and preparing results...
DONE (t=0.00s)
creating index...
index created!
Running per image evaluation...
Evaluate annotation type *keypoints*
COCOeval_opt.evaluate() finished in 0.00 seconds.
Accumulating evaluation results...
COCOeval_opt.accumulate() finished in 0.00 seconds.
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=   all | maxDets= 20 ] = 0.865
 Average Precision  (AP) @[ IoU=0.50      | area=   all | maxDets= 20 ] = 0.901
 Average Precision  (AP) @[ IoU=0.75      | area=   all | maxDets= 20 ] = 0.901
 Average Precision  (AP) @[ IoU=0.50:0.95 | area=medium | maxDets= 20 ] = -1.000
 Average Precision  (AP) @[ IoU=0.50:0.95 | area= large | maxDets= 20 ] = 0.865
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=   all | maxDets= 20 ] = 0.880
 Average Recall     (AR) @[ IoU=0.50      | area=   all | maxDets= 20 ] = 0.900
 Average Recall     (AR) @[ IoU=0.75      | area=   all | maxDets= 20 ] = 0.900
 Average Recall     (AR) @[ IoU=0.50:0.95 | area=medium | maxDets= 20 ] = -1.000
 Average Recall     (AR) @[ IoU=0.50:0.95 | area= large | maxDets= 20 ] = 0.880
[11/30 16:08:24 d2.evaluation.coco_evaluation]: Evaluation results for keypoints: 
|   AP   |  AP50  |  AP75  |  APm  |  APl   |
|:------:|:------:|:------:|:-----:|:------:|
| 86.480 | 90.099 | 90.099 |  nan  | 86.480 |
```

  4. Please simplify the steps as much as possible so they do not require additional resources to run, such as a private dataset.
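A note on the OKS confusion: pycocotools prints the same `IoU=` header for every annotation type, but when the annotation type is *keypoints* the thresholds are OKS values. The similarity itself can be sketched like this (a toy version of the OKS formula; the names are mine, and `kappas` stands in for the per-keypoint constants):

```python
import math

def oks(dists, scale, kappas, visible):
    # Object Keypoint Similarity: mean over labeled keypoints of
    # exp(-d_i^2 / (2 * s^2 * k_i^2)), where s is the object scale.
    terms = [math.exp(-d ** 2 / (2 * scale ** 2 * k ** 2))
             for d, k, v in zip(dists, kappas, visible) if v]
    return sum(terms) / len(terms)

# Perfectly localized keypoints give OKS = 1.0.
print(oks([0.0, 0.0], 1.0, [0.1, 0.1], [1, 1]))  # 1.0
```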

## Expected behavior:

If there are no obvious error in "what you observed" provided above,
please tell us the expected behavior.

## Environment:

Provide your environment information using the following command:

wget -nc -q https://github.com/facebookresearch/detectron2/raw/master/detectron2/utils/collect_env.py && python collect_env.py


If your issue looks like an installation issue / environment issue,
please first try to solve it yourself with the instructions in
https://detectron2.readthedocs.io/tutorials/install.html#common-installation-issues

created time in 3 hours

issue comment facebookresearch/detectron2

Do you support batch inference?

@riokaa You mean with batch size 8, it's 28 s per frame and not 28 s for the entire batch, right? (By the way, are you on CPU and not GPU? Because 28 s is super slow!) We did some more experiments. It looks like gains from batching come only at higher batch sizes. See this link: apache/incubator-mxnet#6396 So for real-time inference, if you have to process frames as fast as they come (say 5 or 10 fps), and it's not really possible to wait and build a big batch, batching doesn't really help. This is my conclusion so far.

Sorry for my unclear expression. Let me convert my experiment results to a table:

| Batch size | Epoch | Total frame count | Time cost | FPS  |
|:----------:|:-----:|:-----------------:|:---------:|:----:|
| 1          | 256   | 256               | 31s       | 0.12 |
| 8          | 32    | 256               | 28s       | 0.11 |

It shows that batching only increases the speed slightly. As you say, batching has little effect on real-time inference; it only increases speed for training and validation on a fixed dataset.
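The wait-and-batch tradeoff discussed above comes down to a simple chunking loop (a framework-agnostic sketch, not code from this thread): larger chunks amortize per-call overhead, but add latency while the batch fills.

```python
def batches(frames, batch_size):
    # Yield fixed-size chunks of frames; the last chunk may be smaller.
    for i in range(0, len(frames), batch_size):
        yield frames[i:i + batch_size]

print([len(b) for b in batches(list(range(10)), 4)])  # [4, 4, 2]
```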

zhaoxuyan

comment created time in 3 hours

issue comment cocodataset/cocoapi

AssertionError: results in not an array of object

Same here! Has anyone solved this? Thank you!!
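For anyone landing here: this assertion usually means the results JSON passed to `loadRes` is not a top-level list. A minimal valid shape (toy values; field names follow the COCO results format) looks like:

```python
import json

# loadRes expects the results JSON to be a top-level *list* of result
# dicts, not a dict keyed by image.
results = [
    {"image_id": 1, "category_id": 1,
     "bbox": [10.0, 10.0, 50.0, 30.0], "score": 0.9},
]
payload = json.dumps(results)   # serializes as "[...]", which loadRes accepts
```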

VincentDuf

comment created time in 3 hours

issue comment facebookresearch/detectron2

Do you support batch inference?

@riokaa You mean with batch size 8, it's 28 s per frame and not 28 s for the entire batch, right? (By the way, are you on CPU and not GPU? Because 28 s is super slow!) We did some more experiments. It looks like gains from batching come only at higher batch sizes. See this link: https://github.com/apache/incubator-mxnet/issues/6396 So for real-time inference, if you have to process frames as fast as they come (say 5 or 10 fps), and it's not really possible to wait and build a big batch, batching doesn't really help. This is my conclusion so far.

zhaoxuyan

comment created time in 4 hours

issue comment facebookresearch/detectron2

Register custom dataset -AssertionError

@RishiMalhotra920 @rogertrullo @ppwwyyxx You may find these 3 methods helpful:

- DatasetCatalog.list() lists all registered dataset instances.
- DatasetCatalog.get('coco_instance_name')
- DatasetCatalog.remove('coco_instance_name')

You could use something like this to remove and re-register:

```python
dataset_name = 'coco_dataset'

if dataset_name in DatasetCatalog.list():
    DatasetCatalog.remove(dataset_name)

register_coco_instances(dataset_name, ...)
```
andresviana

comment created time in 4 hours

issue comment facebookresearch/detectron2

Example for rotated faster rcnn

I have not seen the updated projects/RotatedFasterRCNN, and I would like to contribute to it. What should I do?

cjt222

comment created time in 5 hours

started rbgirshick/yacs

started time in 6 hours

issue comment facebookresearch/detectron2

I'm using the panoptic_fpn_R_101_3x model. I have category_id = 40. How can I get contours only for category_id = 40? (it's sky)

If I try to print np.array(outputs["instances"].pred_masks), I receive an array of False values:

[False False False ... False False False]
(... hundreds of identical all-False rows omitted ...)

ScantyDaemon

comment created time in 8 hours

issue comment facebookresearch/detectron2

Can I run detectron2 with an RTX 30 series GPU

@ppwwyyxx Did you try the preview version of pytorch 1.7 with CUDA 11.1 or CUDA 11.0? Are they fixed?

kyliao426

comment created time in 9 hours

issue comment facebookresearch/detectron2

Can I run detectron2 with an RTX 30 series GPU

In pytorch 1.7 the above bug was not fixed.

So if I want to run detectron2 on an RTX 3070, can it be achieved, or should I wait for the bug to be fixed?

kyliao426

comment created time in 9 hours

issue comment facebookresearch/detectron2

Can I run detectron2 with an RTX 30 series GPU

In pytorch 1.7 the above bug was not fixed.

kyliao426

comment created time in 9 hours

issue comment facebookresearch/detectron2

Can I run detectron2 with an RTX 30 series GPU

I successfully installed pytorch 1.7 + CUDA 11.1 on my RTX 3070 and it runs normally, but detectron2 still can't run. It looks like a compatibility issue.

kyliao426

comment created time in 9 hours

fork Suryash89/py-faster-rcnn

Faster R-CNN (Python implementation) -- see https://github.com/ShaoqingRen/faster_rcnn for the official MATLAB version

fork in 9 hours

started rbgirshick/py-faster-rcnn

started time in 9 hours

issue closed facebookresearch/detectron2

I'm using the panoptic_fpn_R_101_3x model. I have category_id = 40. How can I get contours only for category_id = 40? (it's sky)

Example of my code:

```python
cfg = get_cfg()
cfg.MODEL.DEVICE = "cpu"
cfg.merge_from_file("../configs/COCO-PanopticSegmentation/panoptic_fpn_R_101_3x.yaml")
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5
cfg.MODEL.WEIGHTS = "detectron2://COCO-PanopticSegmentation/panoptic_fpn_R_101_3x/139514519/model_final_cafdb1.pkl"
predictor = DefaultPredictor(cfg)

im = cv2.imread("sky.png")
outputs = predictor(im)
class_names = MetadataCatalog.get(cfg.DATASETS.TRAIN[0]).stuff_classes
a = np.array(outputs["panoptic_seg"][1])
for x in a:
    if not x['isthing']:
        print(class_names[x['category_id']])
        if class_names[x['category_id']] == 'sky':
            pass  ## IMSHOW CONTOURS OF SKY ##
```


closed time in 10 hours

ScantyDaemon

issue comment facebookresearch/detectron2

I'm using the panoptic_fpn_R_101_3x model. I have category_id = 40. How can I get contours only for category_id = 40? (it's sky)

The output format is documented in https://detectron2.readthedocs.io/tutorials/models.html#model-output-format
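Per those docs, `outputs["panoptic_seg"]` is a pair: a per-pixel map of segment ids plus a `segments_info` list of dicts with `id`, `category_id`, and `isthing`. The idea can be sketched with plain lists (a toy version; the real id map is a tensor): collect the segment ids whose `category_id` matches, then turn the id map into a boolean mask.

```python
def mask_for_category(seg_ids, segments_info, category_id):
    # seg_ids: 2-D grid of per-pixel segment ids (toy stand-in for the tensor).
    wanted = {s["id"] for s in segments_info if s["category_id"] == category_id}
    return [[pid in wanted for pid in row] for row in seg_ids]

segments_info = [{"id": 1, "isthing": False, "category_id": 40},
                 {"id": 2, "isthing": True, "category_id": 0}]
seg_ids = [[1, 1, 2],
           [1, 2, 2]]
print(mask_for_category(seg_ids, segments_info, 40))
# [[True, True, False], [True, False, False]]
```

The resulting boolean mask is what a contour finder (e.g. OpenCV) would be applied to.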

ScantyDaemon

comment created time in 10 hours

issue closed facebookresearch/detectron2

detectron2 custom data evaluation using CocoEvaluator

Hi there,

I am trying to train the model on a custom dataset. I have registered both train and test data in the dataset catalog, but when I try to use COCOEvaluator on the custom data, the code fails as shown below:

[screenshot of the error]

The code below shows how I registered the dataset:

```python
def get_images_dicts(csv_file, img_dir):
    df = pd.read_csv(csv_file)
    df['filename'] = df['filename'].map(lambda x: img_dir + x)

    classes = ['0']

    df['class_int'] = df['class'].map(lambda x: classes.index(str(x)))

    dataset_dicts = []
    for filename in df['filename'].unique().tolist():
        record = {}

        height, width = np.array(Image.open(filename)).shape[:2]

        record["file_name"] = filename
        record["height"] = height
        record["width"] = width

        objs = []
        for index, row in df[(df['filename'] == filename)].iterrows():
            obj = {
                'bbox': [row['xmin'], row['ymin'], row['xmax'], row['ymax']],
                'bbox_mode': BoxMode.XYXY_ABS,
                'category_id': row['class_int'],
                "iscrowd": 0
            }
            objs.append(obj)
        record["annotations"] = objs
        dataset_dicts.append(record)
    return dataset_dicts

classes = ['0']
for d in ["train", "test"]:
    print("input/" + d + '_labels.csv', "input/" + d + '/')
    DatasetCatalog.register("input/" + d, lambda d=d: get_images_dicts("input/" + d + '_labels.csv', "input/" + d + '/'))
    MetadataCatalog.get("input/" + d).set(thing_classes=classes)

class CocoTrainer(DefaultTrainer):

    @classmethod
    def build_evaluator(cls, cfg, dataset_name, output_folder=None):

        if output_folder is None:
            os.makedirs("coco_eval", exist_ok=True)
            output_folder = "coco_eval"

        return COCOEvaluator(dataset_name, cfg, False, output_folder)

cfg.DATASETS.TRAIN = ("input/train",)
cfg.DATASETS.TEST = ("input/test",)
.
.
trainer = CocoTrainer(cfg)
```
Thanks in advance. Regards, Krishna

closed time in 10 hours

krishnamoorthybabu

issue comment facebookresearch/detectron2

detectron2 custom data evaluation using CocoEvaluator

Please follow https://detectron2.readthedocs.io/tutorials/datasets.html#standard-dataset-dicts and include image_id.
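Concretely, each dict returned by the registration function needs a unique `image_id` so COCOEvaluator can index predictions against ground truth. A sketch of the required shape (values are placeholders, extending the `record` dict built in this thread):

```python
# Each dataset dict needs a unique "image_id" in addition to the
# file_name/height/width/annotations fields (placeholder values below).
record = {
    "file_name": "input/train/img0.jpg",
    "image_id": 0,          # the field the evaluator was missing
    "height": 480,
    "width": 640,
    "annotations": [],
}
```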

krishnamoorthybabu

comment created time in 10 hours

issue closed facebookresearch/detectron2

Can I run detectron2 with an RTX 30 series GPU

As the title says, I want to upgrade my device. Does detectron2 already support RTX 30 series GPUs? If yes, how can I do that?

thanks

closed time in 10 hours

kyliao426

issue comment facebookresearch/detectron2

Can I run detectron2 with an RTX 30 series GPU

It can build if a pytorch bug is fixed: https://github.com/pytorch/pytorch/pull/47585

kyliao426

comment created time in 10 hours

started rbgirshick/py-faster-rcnn

started time in 10 hours
