
3D Restoration of Road Real Scene Based on Deep Learning for UAV Images

Abstract: To address the geometric deformation and texture distortion that moving targets cause in road real-scene 3D reconstruction from UAV oblique imagery, a deep-learning-based restoration method for road real-scene 3D models is proposed. First, objects in the images are detected with a YOLOv8 network augmented with an attention mechanism. Second, given the object labels marked in each image, the category information of each triangular face of the generated 3D mesh model is tallied from the face's projected positions in its set of visible images, combined with the extents of the labeled objects, to determine which faces belong to moving targets. Finally, the moving-target determination results are used to repair the geometric deformation and texture errors the moving targets introduced into the 3D model, completing the road real-scene 3D reconstruction. The results show that the improved network raises the mean average precision (mAP) by an average of 10.82% over the YOLOv4, YOLOv5 and YOLOv8 models, and the moving-target determination accuracy reaches 97.43%. Compared with popular foreign commercial software, the proposed method achieves a better reconstruction and restoration effect with a higher degree of automation.
To address the geometric deformation and texture distortion caused by moving targets in road real-scene 3D models built from UAV oblique photography, a deep-learning-based restoration method for the road real-scene 3D model is proposed. First, a YOLOv8 network incorporating an attention mechanism is employed to detect objects in the images. Second, based on each detected object's extent, the presence of moving targets is determined by analyzing the category information of each triangular face of the mesh according to its projected positions in the set of visible images. Finally, the geometric deformation and texture distortion of the 3D model are repaired using the moving-target determination results to complete the road real-scene 3D reconstruction. The results indicate that the improved network enhances the mAP by an average of 10.82% over YOLOv4, YOLOv5 and YOLOv8, and the accuracy of moving-target determination reaches 97.43%. Moreover, in contrast to commercial software, the proposed method demonstrates a superior restoration effect and a higher level of automation.
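The moving-target determination described above can be sketched as a per-face majority vote: each mesh triangle is projected into every image that sees it, the detected-object label at that projection is collected, and the dominant label decides whether the face belongs to a moving target. The sketch below is illustrative only; all names (`label_at`, `is_moving_target`, the detection format, the class list) are assumptions, not the paper's implementation.

```python
from collections import Counter

# Illustrative set of classes treated as moving targets (an assumption).
MOVING_CLASSES = {"car", "truck", "bus", "pedestrian"}

def label_at(point, detections):
    """Return the class of the first detection box containing `point`.
    `detections` is a list of (class, (x1, y1, x2, y2)) axis-aligned boxes."""
    x, y = point
    for cls, (x1, y1, x2, y2) in detections:
        if x1 <= x <= x2 and y1 <= y <= y2:
            return cls
    return "background"

def is_moving_target(projections, per_image_detections):
    """Vote across all images in which the triangle is visible.
    `projections` holds one projected 2D point per visible image;
    `per_image_detections` holds that image's detection boxes."""
    votes = Counter(
        label_at(pt, dets)
        for pt, dets in zip(projections, per_image_detections)
    )
    top_class, _ = votes.most_common(1)[0]
    return top_class in MOVING_CLASSES
```

For example, a triangle seen in three images, twice inside a "car" box and once on background, is voted a moving target; the geometry and texture it contributes would then be scheduled for repair.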

Keywords: 3D reconstruction of real scene; deep learning; Mesh model; occlusion culling; texture restoration

JIANG Xiao, QIU Chunxia, ZHANG Chunsen, GUO Bingxuan, SHUAI Linhong, PENG Zhe, JIA Xin


College of Geomatics, Xi'an University of Science and Technology, Xi'an 710054, Shaanxi, China

State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, Hubei, China

Wuhan Xuntu Spatiotemporal Software Technology Co., Ltd., Wuhan 430223, Hubei, China

Keywords: real-scene 3D reconstruction; deep learning; Mesh model; occlusion culling; texture restoration

Funding: National Natural Science Foundation of China (92038301); Open Fund of the Key Laboratory of Urban Land Resources Monitoring and Simulation, Ministry of Natural Resources (KF-2022-07-003); Wuhan University-Huawei Spatial Information Technology Innovation Laboratory Project (K22-4201-011); Open Research Fund of the Changjiang River Scientific Research Institute, Changjiang Water Resources Commission (CK-WV20231167/KY)

2024

Journal of Xi'an University of Science and Technology
Xi'an University of Science and Technology


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 1.154
ISSN: 1672-9315
Year, Volume (Issue): 2024, 44(1)