Wei Ke, Jie Chen, Qixiang Ye, Deep contour and symmetry scored object proposal, Pattern Recognition Letters, Volume 119, 2019, Pages 172-179, ISSN 0167-8655, https://doi.org/10.1016/j.patrec.2018.01.004
Deep contour and symmetry scored object proposal
|Author:||Ke, Wei (1,2); Chen, Jie (2); Ye, Qixiang (1)|
1School of Electronic, Electrical and Communication Engineering, University of Chinese Academy of Sciences, Beijing, 101408, China
2Center for Machine Vision and Signal Analysis, University of Oulu, Oulu, 90570, Finland
|Persistent link:||http://urn.fi/urn:nbn:fi-fe2019052316707|
|Publish Date:||2021-03-01|
Object proposal has been successfully applied in recent supervised and weakly supervised visual object detection tasks to improve computational efficiency. The classical grouping-based object proposal approach can produce region proposals with high localization accuracy, but introduces significant redundancy due to the lack of an object confidence measure with which to evaluate the proposals. In this paper, we propose leveraging essential properties of images, i.e., contour and symmetry, to score the redundant region proposals. Specifically, contour and symmetry are extracted by a Simultaneous Contour and Symmetry Detection Network (SCSDN) and used to score each bounding box within a Bayesian framework, which guarantees that the scoring procedure is adaptive to general objects. A subset of high-scored proposals preserves the recall rate while significantly decreasing the redundancy. Experimental results show that the proposed approach improves the baseline by increasing the recall rate from 0.87 to 0.89 on the PASCAL VOC 2007 dataset. It also outperforms the state-of-the-art on AUC and uses far fewer object proposals to achieve a comparable recall rate.
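The scoring-and-filtering idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's method: the box format, the per-box averaging of contour/symmetry responses, the linear weighting (standing in for the Bayesian combination), and all function names are assumptions introduced for this sketch.

```python
import numpy as np

def score_proposals(boxes, contour_map, symmetry_map,
                    w_contour=0.5, w_symmetry=0.5):
    """Score each box (x1, y1, x2, y2) by the mean contour and symmetry
    response inside it. A linear weighting is used here as a stand-in for
    the paper's Bayesian scoring."""
    scores = []
    for x1, y1, x2, y2 in boxes:
        c = contour_map[y1:y2, x1:x2].mean()
        s = symmetry_map[y1:y2, x1:x2].mean()
        scores.append(w_contour * c + w_symmetry * s)
    return np.array(scores)

def top_k_proposals(boxes, scores, k):
    """Keep only the k highest-scored proposals, reducing redundancy
    while aiming to preserve recall."""
    order = np.argsort(scores)[::-1][:k]
    return [boxes[i] for i in order]

# Toy example: a synthetic 10x10 image where contour and symmetry
# responses are concentrated in the top-left quadrant.
contour = np.zeros((10, 10))
contour[0:5, 0:5] = 1.0
symmetry = np.zeros((10, 10))
symmetry[0:5, 0:5] = 1.0

boxes = [(0, 0, 5, 5), (5, 5, 10, 10)]
scores = score_proposals(boxes, contour, symmetry)
kept = top_k_proposals(boxes, scores, k=1)
```

In this toy setting the box covering the high-response region scores higher and survives the filtering, illustrating how a scored subset can discard redundant proposals.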
Pattern Recognition Letters
|Pages:||172 - 179|
|Type of Publication:||A1 Journal article – refereed|
|Field of Science:||113 Computer and information sciences|
This work was supported in part by the National Science Foundation of China under Grant 61671427, and Beijing Municipal Science and Technology Commission under Grant Z161100001616005.
© 2018 Published by Elsevier B.V. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/.