
Research on Shared Data for Cooperative Perception of Intelligent Connected Vehicles

In cooperative perception for intelligent connected vehicles, the traditional approach is to fuse the obstacle lists generated by the individual vehicles. However, this approach can only fuse objects that already appear in each vehicle's obstacle list and cannot perceive obstacles that a single vehicle has missed. To improve the cooperative perception capability of intelligent connected vehicles, this paper proposes a method for generating and fusing a geometric map shared among multiple vehicles. During LiDAR-based object detection, the method extracts the height features of every pillar region in the raw 3D point cloud and compresses them into a 2.5D geometric map, which is combined with the object detection results to form shared data that is sent to the other vehicles within communication range. This supplements the height information of unknown obstacles, extends the perception range of a single vehicle, and reduces the impact of missed detections. Experiments show that, after fusing the shared data, the vehicle's object detection accuracy reaches up to 85.71%, which is 14.28% higher than single-vehicle perception, and that under 4G communication the average transmission delay of the shared data is 5.23 ms.
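As an illustration of the map-generation step described in the abstract, the following sketch bins a LiDAR point cloud into pillar cells, keeps the maximum height per cell, and quantises the result to 8 bits so it is cheap to transmit. The grid extent, pillar size, height range, and the use of Python/NumPy are assumptions made for illustration, not details taken from the paper.

```python
# Minimal sketch (not the authors' implementation) of a 2.5D geometric map:
# the ground plane is divided into pillar cells, the maximum point height in
# each cell is kept, and the map is quantised to uint8 for compression.
# Grid extent, pillar size and height range below are illustrative assumptions.
import numpy as np

X_RANGE = (0.0, 69.12)      # metres, forward (assumed detection range)
Y_RANGE = (-39.68, 39.68)   # metres, lateral
PILLAR_SIZE = 0.16          # metres per pillar cell (assumed)
H_MIN, H_MAX = -3.0, 1.0    # height clip range in metres (assumed)

def build_height_map(points: np.ndarray) -> np.ndarray:
    """points: (N, 3) array of x, y, z in the ego LiDAR frame.
    Returns a uint8 2.5D height map of shape (nx, ny)."""
    nx = int((X_RANGE[1] - X_RANGE[0]) / PILLAR_SIZE)
    ny = int((Y_RANGE[1] - Y_RANGE[0]) / PILLAR_SIZE)

    # Keep only points that fall inside the grid.
    mask = ((points[:, 0] >= X_RANGE[0]) & (points[:, 0] < X_RANGE[1]) &
            (points[:, 1] >= Y_RANGE[0]) & (points[:, 1] < Y_RANGE[1]))
    pts = points[mask]

    # Pillar (cell) index of every point.
    ix = ((pts[:, 0] - X_RANGE[0]) / PILLAR_SIZE).astype(np.int64)
    iy = ((pts[:, 1] - Y_RANGE[0]) / PILLAR_SIZE).astype(np.int64)
    flat = ix * ny + iy

    # Per-pillar maximum height (vectorised scatter-max); empty cells stay at H_MIN.
    height = np.full(nx * ny, H_MIN, dtype=np.float32)
    np.maximum.at(height, flat, np.clip(pts[:, 2], H_MIN, H_MAX))

    # Quantise to uint8 (0 = empty / lowest, 255 = highest) for compact sharing.
    hm = (height - H_MIN) / (H_MAX - H_MIN) * 255.0
    return hm.reshape(nx, ny).astype(np.uint8)
```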
Perceiving Cooperatively Shared Data for Connected Autonomous Vehicles
In order to perceive cooperatively with a connected autonomous vehicle (CAV), the traditional method is to fuse the obstacle lists generated by individual vehicles, but this can only fuse objects that already exist in the obstacle lists and cannot fuse obstacles that are undetected by a single vehicle. In order to improve the cooperative perception ability of the CAV, this paper proposes a multi-vehicle shared geometric map generation and fusion method, which extracts the height feature of each Pillar area in the original 3D point cloud during target detection with a LiDAR. The height feature is compressed to obtain a 2.5D geometric map, which is combined with the results of the target detection algorithm to generate shared data and sent to other vehicles within the communication range, so as to supplement the height information of unknown obstacles, expand the perception range of a single vehicle and reduce the impact of missed detections. Experimental results show that the vehicle's target detection accuracy after shared data fusion reaches up to 85.71%, which is 14.28% higher than that of single-vehicle perception. Under the condition of 4G communication, the average transmission delay of shared data is 5.23 ms.
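The receive-side fusion could then, for example, re-project a sender's 2.5D map into the ego grid and merge it cell by cell, so pillars that the ego vehicle never observed are filled in. The sketch below assumes a known 2D relative pose (yaw, tx, ty), that both maps share the same hypothetical grid parameters as above, and uses an element-wise maximum as one simple fusion rule; none of these choices are confirmed by the paper.

```python
# Minimal sketch of fusing a received 2.5D map into the ego map.
# Assumes recv_map uses the same grid layout as ego_map and that the
# relative pose (yaw, tx, ty) of the sender in the ego frame is known.
import numpy as np

def fuse_height_maps(ego_map: np.ndarray,
                     recv_map: np.ndarray,
                     yaw: float, tx: float, ty: float,
                     pillar_size: float = 0.16,
                     x0: float = 0.0, y0: float = -39.68) -> np.ndarray:
    """Warp recv_map (sender frame) into the ego grid and fuse by max."""
    nx, ny = ego_map.shape
    fused = ego_map.copy()
    c, s = np.cos(yaw), np.sin(yaw)

    # Centre coordinates of every ego cell, in metres.
    gx, gy = np.meshgrid(np.arange(nx), np.arange(ny), indexing="ij")
    ex = x0 + (gx + 0.5) * pillar_size
    ey = y0 + (gy + 0.5) * pillar_size

    # Ego-frame coordinates -> sender-frame coordinates (inverse rigid transform).
    sx = c * (ex - tx) + s * (ey - ty)
    sy = -s * (ex - tx) + c * (ey - ty)

    # Sender cell index for every ego cell; keep only cells inside both grids.
    ix = ((sx - x0) / pillar_size).astype(np.int64)
    iy = ((sy - y0) / pillar_size).astype(np.int64)
    ok = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)

    # Element-wise maximum keeps the taller (i.e. occupied) observation,
    # which adds height information for pillars the ego vehicle missed.
    fused[ok] = np.maximum(ego_map[ok], recv_map[ix[ok], iy[ok]])
    return fused
```

The maximum rule is conservative: it never erases an obstacle the ego vehicle already sees, and it only adds height where the sender reports something taller.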

connected autonomous vehicles; cooperative perception; shared data; pillar; 2.5D geometric map

王宏多、王健、陈启、郭欣宇、蒋品、陈腾云


College of Computer Science and Technology, Jilin University, Changchun 130012, China

Wireless X Labs, Huawei Technologies Co., Ltd., Shanghai 200120, China

Intelligent connected vehicles; cooperative perception; shared data; Pillar; 2.5D geometric map

2024

Mechanical Science and Technology for Aerospace Engineering
Northwestern Polytechnical University


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.565
ISSN:1003-8728
Year, Volume (Issue): 2024, 43(12)