Institutional Repository of the Institute of Optics and Electronics, Chinese Academy of Sciences
Title: Efficient object detection based on selective attention
Authors: Yu, Huapeng1,2,3; Chang, Yongxin1,2,3; Lu, Pei1,3; Xu, Zhiyong1; Fu, Chengyu1; Wang, Yafei2
Journal: Computers and Electrical Engineering
Publication Date: 2014
Volume: 40, Issue: 3, Pages: 907-919
Subject: Information filtering; Object recognition
DOI: 10.1016/j.compeleceng.2013.09.002
Corresponding Author: Yu, H. (yuhuapeng@uestc.edu.cn)
Article Type: Journal Article
Abstract: In this paper, we use biologically inspired selective attention to improve the efficiency and performance of object detection in clutter. First, we propose a novel bottom-up attention model. We argue that heuristic feature selection based on bottom-up attention can reliably select invariant and discriminative features; with these selected features, object detection performance improves markedly and consistently. We then propose the concept of a saccade map, derived from bottom-up attention, to simulate saccades (eye movements) in vision. Sliding a detection window only within the saccade map significantly reduces computational complexity and improves performance by effectively filtering out distracting information. Building on these ideas, we present a general framework for object detection that integrates bottom-up attention. Evaluations on the UIUC cars and Weizmann-Shotton horses datasets show state-of-the-art performance of our object detection model. © 2013 Elsevier Ltd. All rights reserved.
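The saccade-map idea described in the abstract can be sketched as follows. This is an illustrative approximation only, not the authors' actual model: a crude center-surround saliency map stands in for their bottom-up attention model, the "saccade map" is a simple threshold of that map, and the detector evaluates sliding windows only where the window center falls inside it. All function names, window sizes, and the `keep_frac` threshold are assumptions introduced for this sketch.

```python
import numpy as np

def saliency_map(img, center=3, surround=15):
    """Crude center-surround saliency: |small-scale mean - large-scale mean|.
    A stand-in for the paper's bottom-up attention model (assumption)."""
    def box_mean(a, k):
        # O(1)-per-pixel box filter via a summed-area table, edge-padded
        p = k // 2
        ap = np.pad(a, p, mode="edge")
        c = np.cumsum(np.cumsum(ap, axis=0), axis=1)
        c = np.pad(c, ((1, 0), (1, 0)))  # zero row/col -> integral image
        return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)
    return np.abs(box_mean(img, center) - box_mean(img, surround))

def detect(img, score_fn, win=8, stride=4, keep_frac=0.05):
    """Slide a window, but score only positions inside the saccade map."""
    sal = saliency_map(img)
    # saccade map: roughly the keep_frac most salient pixels
    thresh = np.quantile(sal, 1.0 - keep_frac)
    saccade = sal > max(thresh, 0.0)
    hits, evaluated, total = [], 0, 0
    for y in range(0, img.shape[0] - win + 1, stride):
        for x in range(0, img.shape[1] - win + 1, stride):
            total += 1
            cy, cx = y + win // 2, x + win // 2
            if not saccade[cy, cx]:
                continue  # window center is not salient: skip the classifier
            evaluated += 1
            if score_fn(img[y:y + win, x:x + win]) > 0:
                hits.append((y, x))
    return hits, evaluated, total
```

On a synthetic image with one bright square, a trivial brightness score (`lambda w: w.mean() - 0.5`) localizes the square while the classifier runs on only a small fraction of the 225 candidate windows, which is the efficiency argument the abstract makes.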
Indexed by: SCI; EI
Language: English
WOS Accession Number: WOS:000336187000014
ISSN: 0045-7906
URI: http://ir.ioe.ac.cn/handle/181551/5073
Appears in Collections: Photoelectric Detection and Signal Processing Laboratory (5th Lab), Journal Papers

Files in This Item:
File: 2014-2022.pdf (2059 KB) | Content Type: journal article | Version: author's accepted manuscript | Access: open access (contact for full text)

Author Affiliations: 1. 5th Lab, Institute of Optics and Electronics, Chinese Academy of Sciences, P.O. Box 350, Shuangliu, Chengdu 610209, Sichuan Province, China
2. School of Optoelectronic Information, University of Electronic Science and Technology of China, Chengdu 610054, China
3. Graduate University of Chinese Academy of Sciences, Beijing 100039, China

Recommended Citation:
Yu, Huapeng, Chang, Yongxin, Lu, Pei, et al. Efficient object detection based on selective attention[J]. Computers and Electrical Engineering, 2014, 40(3): 907-919.
Google Scholar
Similar articles in Google Scholar
[Yu, Huapeng]'s Articles
[Chang, Yongxin]'s Articles
[Lu, Pei]'s Articles
CSDL cross search
Similar articles in CSDL Cross Search
[Yu, Huapeng]'s Articles
[Chang, Yongxin]'s Articles
[Lu, Pei]'s Articles

Items in IR are protected by copyright, with all rights reserved, unless otherwise indicated.

 

 

Copyright © 2007-2016 Institute of Optics and Electronics, Chinese Academy of Sciences
Powered by CSpace