H-Deformable-DETR for MMDet

This is the official implementation of the paper "DETRs with Hybrid Matching".

Authors: Ding Jia, Yuhui Yuan, Haodi He, Xiaopei Wu, Haojun Yu, Weihong Lin, Lei Sun, Chao Zhang, Han Hu

Model ZOO

🍺🍺🍺Please check out the branch mmdetection-with-plug-in for a cleaner organization.📣📣📣

We provide a set of baseline results and trained models available for download:

Models with the ResNet-50 backbone

| Name | Backbone | query | epochs | AP in Paper | AP | download |
| --- | --- | --- | --- | --- | --- | --- |
| Deformable-DETR | R50 | 300 | 12 | 43.7 | 43.7 | model |
| Deformable-DETR | R50 | 300 | 36 | 46.8 | 46.7 | model |
| Deformable-DETR + tricks | R50 | 300 | 12 | 47.0 | 46.9 | model |
| Deformable-DETR + tricks | R50 | 300 | 36 | 49.0 | 49.0 | model |
| H-Deformable-DETR + tricks | R50 | 300 | 12 | 48.7 | 48.5 | model |
| H-Deformable-DETR + tricks | R50 | 300 | 36 | 50.0 | 49.9 | model |

Note that, to align with the official implementation of Deformable DETR and with the other backbones, we do not freeze stage 1 of the ResNet-50 backbone.

Models with Swin Transformer backbones

| Name | Backbone | query | epochs | AP in Paper | AP | download |
| --- | --- | --- | --- | --- | --- | --- |
| Deformable-DETR | Swin Tiny | 300 | 12 | 45.3, 46.0 | 46.1 | model |
| Deformable-DETR | Swin Tiny | 300 | 36 | 49.0, 49.6 | 49.6 | model |
| Deformable-DETR + tricks | Swin Tiny | 300 | 12 | 49.3 | 49.3 | model |
| Deformable-DETR + tricks | Swin Tiny | 300 | 36 | 51.8 | 52.1 | model |
| H-Deformable-DETR + tricks | Swin Tiny | 300 | 12 | 50.6 | 50.8 | model |
| H-Deformable-DETR + tricks | Swin Tiny | 300 | 36 | 53.2 | 53.3 | model |

Installation

We test our models under python=3.7.10, pytorch=1.10.1, cuda=10.2. Other versions may work as well.

  1. Clone this repo
git clone https://github.com/HDETR/H-Deformable-DETR-mmdet.git
cd H-Deformable-DETR-mmdet
  2. Install PyTorch and torchvision

Follow the instructions at https://pytorch.org/get-started/locally/.

# an example:
conda install -c pytorch pytorch torchvision
  3. Install other needed packages
pip install -r requirements.txt
pip install openmim
mim install mmcv-full
pip install -e .
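
To verify the environment, a quick sanity check (a minimal sketch; the printed versions depend on your setup) confirms that PyTorch, mmcv-full, and this mmdet fork are importable and that CUDA is visible:

# Optional sanity check: print library versions and CUDA availability
python -c "import torch; print('torch', torch.__version__, 'cuda available:', torch.cuda.is_available())"
python -c "import mmcv; print('mmcv-full', mmcv.__version__)"
python -c "import mmdet; print('mmdet', mmdet.__version__)"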

Data

Please download the COCO 2017 dataset and organize it as follows:

H-Deformable-DETR-mmdet
├── data
│   ├── coco
│   │   ├── train2017
│   │   ├── val2017
│   │   └── annotations
│   │        ├── instances_train2017.json
│   │        └── instances_val2017.json
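
If you do not have COCO 2017 locally yet, the images and annotations can be fetched from the official cocodataset.org download links and unpacked into the layout above (a sketch; assumes wget and unzip are available):

# Download and unpack COCO 2017 into data/coco (the images total roughly 20 GB)
mkdir -p data/coco && cd data/coco
wget http://images.cocodataset.org/zips/train2017.zip
wget http://images.cocodataset.org/zips/val2017.zip
wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
unzip train2017.zip && unzip val2017.zip && unzip annotations_trainval2017.zip
cd ../..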

Run

To train a model using 8 GPUs:

GPUS_PER_NODE=8  ./tools/dist_train.sh \
    <config path> \
    8
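
For quick debugging on a single GPU, the standard MMDet training entry point should also work (a sketch assuming this repo keeps the upstream tools/train.py script and its --work-dir option):

# Single-GPU training, e.g. for debugging a config
python tools/train.py \
    <config path> \
    --work-dir work_dirs/debug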

To train/evaluate a model with a Swin Transformer backbone, you need to download the pretrained backbone from the official Swin Transformer repo first and specify the checkpoint argument as in our configs.
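
For example, to use the Swin Tiny backbone, the ImageNet-pretrained weights can be downloaded from the official Swin Transformer release and referenced by the config (the release URL and the pretrained/ directory below are assumptions; point the checkpoint field in your config to wherever you store the file):

# Hypothetical example: fetch the ImageNet-1k pretrained Swin Tiny weights
mkdir -p pretrained
wget -P pretrained https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth
# then set the backbone checkpoint path in the config to pretrained/swin_tiny_patch4_window7_224.pth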

To evaluate a model using 8 GPUs:

GPUS_PER_NODE=8 ./tools/dist_test.sh \
    <config path> \
    <checkpoint path> \
    8 --eval bbox
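
For single-GPU evaluation or quick debugging, the standard MMDet test script should work as well (a sketch assuming this repo keeps the upstream tools/test.py entry point):

# Single-GPU evaluation with COCO bbox metrics
python tools/test.py \
    <config path> \
    <checkpoint path> \
    --eval bbox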

Modified files compared to original MMDet

  • configs/deformable_detr: add baseline configs
  • configs/h-deformable-detr: add h-deformable-detr configs
  • mmdet/models/utils/transformer.py: enable tricks and decoder_self_attn_mask
  • mmdet/models/dense_heads/hybrid_branch_deformable_detr_head.py: enable hybrid branch strategy and tricks
  • mmdet/models/dense_heads/deformable_detr_head.py: enable tricks

Citing H-Deformable-DETR for MMDet

If you find H-Deformable-DETR for MMDet useful in your research, please consider citing:

@article{jia2022detrs,
  title={DETRs with Hybrid Matching},
  author={Jia, Ding and Yuan, Yuhui and He, Haodi and Wu, Xiaopei and Yu, Haojun and Lin, Weihong and Sun, Lei and Zhang, Chao and Hu, Han},
  journal={arXiv preprint arXiv:2207.13080},
  year={2022}
}

@article{zhu2020deformable,
  title={Deformable detr: Deformable transformers for end-to-end object detection},
  author={Zhu, Xizhou and Su, Weijie and Lu, Lewei and Li, Bin and Wang, Xiaogang and Dai, Jifeng},
  journal={arXiv preprint arXiv:2010.04159},
  year={2020}
}
