

Adjacent Coordination Network for Salient Object Detection in 360 Degree Omnidirectional Images
To address the problems of salient object scale variation, edge discontinuity, and boundary blurring in Salient Object Detection (SOD) for 360° omnidirectional images, a method based on an Adjacent Coordination Network (ACoNet) is proposed. First, an adjacent detail fusion module is used to capture detail and edge information from adjacent features, which facilitates accurate localization of salient objects. Then, a semantic-guided feature aggregation module is employed to aggregate semantic feature information at different scales between shallow and deep features and to suppress the noise transmitted by shallow features, alleviating the discontinuity between salient objects and background regions and the blurred boundaries that arise in the decoding stage. In addition, a multi-scale semantic fusion submodule is constructed to enlarge the multi-scale receptive fields of different convolutional layers, so that salient object boundaries are trained more precisely. Extensive experiments on two public datasets demonstrate that, compared with 13 other state-of-the-art methods, the proposed approach achieves clear improvements on six objective evaluation metrics; moreover, the visualized saliency maps show better edge contours and clearer spatial structural details.
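As a rough illustration of the kind of component the abstract describes, the sketch below shows one way a multi-scale fusion submodule that enlarges receptive fields across convolutional layers might be built from parallel dilated convolutions in PyTorch. It is a minimal sketch under assumed design choices (the class name, channel sizes, and dilation rates are hypothetical), not the paper's actual ACoNet implementation.

```python
# Hypothetical sketch (not the paper's released code): a fusion block that
# enlarges the receptive field with parallel dilated convolutions, one common
# way to realize a "multi-scale semantic fusion" component like the one
# described in the abstract. All names and hyperparameters are assumptions.
import torch
import torch.nn as nn


class MultiScaleSemanticFusion(nn.Module):
    def __init__(self, in_channels: int, out_channels: int):
        super().__init__()
        # Parallel 3x3 convolutions with increasing dilation rates view the
        # same feature map through progressively larger receptive fields.
        self.branches = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(in_channels, out_channels, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(out_channels),
                nn.ReLU(inplace=True),
            )
            for d in (1, 2, 4)  # dilation rates chosen for illustration
        ])
        # A 1x1 convolution fuses the concatenated multi-scale responses.
        self.fuse = nn.Conv2d(out_channels * 3, out_channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [branch(x) for branch in self.branches]
        return self.fuse(torch.cat(feats, dim=1))


if __name__ == "__main__":
    # Toy usage: fuse a deep feature map of shape (N, C, H, W).
    block = MultiScaleSemanticFusion(in_channels=256, out_channels=128)
    y = block(torch.randn(1, 256, 32, 32))
    print(y.shape)  # torch.Size([1, 128, 32, 32])
```

Because each dilated branch preserves the spatial resolution and the branches are fused with a 1×1 convolution, a block of this shape can sit between encoder stages without changing feature-map sizes, which is one plausible way to combine context at several scales before decoding.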

Keywords: Salient Object Detection (SOD); Deep learning; 360° omnidirectional images; Multi-scale features

Authors: CHEN Xiaolei (陈晓雷), WANG Xing (王兴), ZHANG Xuegong (张学功), DU Zelong (杜泽龙)


Affiliation: School of Electrical and Information Engineering, Lanzhou University of Technology, Lanzhou 730050, China


Journal: Journal of Electronics & Information Technology (电子与信息学报)
Sponsors: Institute of Electronics, Chinese Academy of Sciences; Department of Information Sciences, National Natural Science Foundation of China
Indexing: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 1.302
ISSN: 1009-5896
Year, Volume (Issue): 2024, 46(12)