Object detection on low-resolution images with two-stage enhancement


Abstract

Although deep learning-based object detection methods achieve superior performance on conventional benchmark datasets, detecting objects in low-resolution (LR) images under diverse degradation conditions remains difficult. To this end, a two-stage enhancement framework for LR image object detection (TELOD) is proposed. In the first stage, an extremely lightweight task disentanglement enhancement network (TDEN) is developed as a super-resolution (SR) sub-network placed before the detector. In the TDEN, SR images are obtained by recurrently connecting an image restoration branch (IRB) and a resolution enhancement branch (REB) to enhance the input LR images. Specifically, the TDEN reduces the difficulty of image reconstruction by dividing the overall enhancement task into two sub-tasks, accomplished by the IRB and the REB, respectively. Furthermore, a shared feature extractor is applied across the two sub-tasks to learn common and accurate feature representations. In the second stage, an auxiliary feature enhancement head (AFEH) driven by high-resolution (HR) image priors is designed to improve the task-specific features produced by the detection neck without any extra inference cost. In particular, a feature interaction module is built into the AFEH to fuse features from the enhancement and detection phases and learn comprehensive information for detection. Extensive experiments show that the proposed TELOD significantly outperforms competing methods, achieving mAP improvements of 1.8% and 3.3% over the second-best method, AERIS, on the degraded VOC and COCO datasets, respectively.
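The abstract describes a two-stage pipeline: a TDEN that recurrently alternates between restoration (IRB) and resolution-enhancement (REB) branches over a shared feature extractor, followed by a standard detector, with the AFEH contributing only during training. The structural sketch below illustrates that data flow only; the paper itself is closed access, so every class, method, and the number of recurrent steps here is a hypothetical placeholder, not the authors' implementation.

```python
# Hypothetical sketch of the TELOD data flow inferred from the abstract.
# All names (TDEN, irb, reb, telod_inference, steps=2) are illustrative
# placeholders, not the authors' actual code or hyperparameters.

class TDEN:
    """Task disentanglement enhancement network: splits LR-image enhancement
    into a restoration sub-task (IRB) and a resolution-enhancement sub-task
    (REB) that share one feature extractor and are connected recurrently."""

    def __init__(self, steps=2):
        self.steps = steps  # assumed number of recurrent IRB<->REB passes

    def shared_extract(self, lr_img):
        # Shared feature extractor used by both sub-tasks.
        return {"features": lr_img}

    def irb(self, state):
        # Image restoration branch: removes degradation artifacts.
        return {"restored": state}

    def reb(self, state):
        # Resolution enhancement branch: upscales the restored features.
        return {"upscaled": state}

    def forward(self, lr_img):
        state = self.shared_extract(lr_img)
        for _ in range(self.steps):  # recurrent connection between branches
            state = self.reb(self.irb(state))
        return state  # stands in for the reconstructed SR image


def telod_inference(lr_img, detector):
    """Stage 1 enhances the LR input; stage 2 runs an ordinary detector.
    The AFEH is a training-time auxiliary head, so it adds no cost here."""
    sr_img = TDEN().forward(lr_img)
    return detector(sr_img)
```

At inference time only the TDEN and the detector run, which is consistent with the abstract's claim that the AFEH improves the neck's features "without any extra inference cost".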

Paper PDF

This paper's license is marked as closed access or non-commercial, so the PDF cannot be viewed on ResearchHub; visit the paper's external site instead.