
Target 3D Reconstruction Technique Based on ISAR Image Sequences

Obtaining the three-dimensional position and structural information of a target from inverse synthetic aperture radar (ISAR) image sequences is important for target recognition and interpretation, space target surveillance, and related technologies. This paper studies scattering-point feature extraction and matching together with 3D reconstruction algorithms, and achieves 3D reconstruction of a target from ISAR image sequences. First, the LoFTR (Local Feature TRansformer) feature-matching algorithm is used to extract the associated positions of the target's strong scattering points; coarse matching followed by fine matching yields the two-dimensional coordinates of the feature points, from which an observation matrix is generated. Then, the orthogonal factorization method is used to compute the three-dimensional positions of the strong scattering points, and fusion of the individual reconstruction results yields a better overall reconstruction. Finally, by processing a sequence of ISAR images of the ALOS satellite, the three-dimensional positions and shape of the target's strong scattering points are reconstructed. The results show that the method can effectively recover the target's three-dimensional spatial information from two-dimensional ISAR images.
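The orthogonal factorization step described in the abstract follows the classical structure-from-motion idea: stack the 2D coordinates of the tracked scattering points across frames into an observation matrix, center it, and split it into a rank-3 motion/shape product via SVD. The sketch below is a minimal illustration of that core step, not the authors' implementation; the function name is hypothetical, and the metric-upgrade step (solving for a corrective matrix that enforces orthonormal camera rows) is omitted, so the recovered motion and shape are only determined up to an affine ambiguity.

```python
import numpy as np

def orthographic_factorization(W):
    """Rank-3 factorization of an observation matrix (illustrative sketch).

    W : (2F, P) array stacking the image x- and y-coordinates of P tracked
        scattering points over F frames (one row pair per frame).
    Returns M (2F, 3) and S (3, P) with W_centered ≈ M @ S, defined only
    up to an affine ambiguity (no metric constraints applied).
    """
    # Center each row about its mean, removing the per-frame translation.
    Wc = W - W.mean(axis=1, keepdims=True)
    # Truncated rank-3 SVD: Wc ≈ U3 diag(s3) V3^T, split symmetrically.
    U, s, Vt = np.linalg.svd(Wc, full_matrices=False)
    r = np.sqrt(s[:3])
    M = U[:, :3] * r          # motion factor
    S = r[:, None] * Vt[:3]   # shape factor (3D point cloud, affine frame)
    return M, S
```

For noiseless affine projections the centered observation matrix has rank at most 3, so the truncated SVD reconstructs it exactly; with noisy ISAR feature tracks, the rank-3 truncation acts as a least-squares denoiser before the metric upgrade.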

ISAR; LoFTR; feature matching; orthogonal decomposition; 3D reconstruction

Li Minmin, Yang Lihong, Wu Chao


School of Optoelectronic Engineering, Xi'an Technological University, Xi'an 710021, China


National Natural Science Foundation of China (62071359, 62001364)

2024

Computer Measurement & Control

China Computer Automated Measurement and Control Technology Association

CSTPCD
Impact factor: 0.546
ISSN: 1671-4598
Year, Volume (Issue): 2024, 32(8)