
Shading aware DSM generation from high resolution multi-view satellite images

In many cases, Digital Surface Models (DSMs) and Digital Elevation Models (DEMs) are obtained with Light Detection and Ranging (LiDAR) or stereo matching. As an active method, LiDAR is very accurate but expensive, which often limits its use to small-scale acquisition. Stereo matching is suitable for large-scale acquisition of terrain information as the number of satellite stereo sensors increases; however, it easily underperforms in textureless areas. Accordingly, this study proposes a Shading Aware DSM GEneration Method (SADGE) for high resolution multi-view satellite images. Considering the complementarity of stereo matching and Shape from Shading (SfS), SADGE combines the advantages of both techniques. First, an improved Semi-Global Matching (SGM) technique is used to generate an initial surface expressed as a DSM; this surface is then refined by optimizing an objective function that models the imaging process in terms of the illumination, the surface albedo, and the object surface normals. Unlike existing shading-based DEM refinement or generation methods, no information about the illumination or the viewing angle is needed, and the concave/convex ambiguity is avoided because multi-view images are utilized. Experiments with ZiYuan-3 and GaoFen-7 images show that the proposed method generates DSMs of higher accuracy (12.5-56.3% improvement), with a sound overall shape and a more detailed surface, compared with a software solution (SURE) for multi-view stereo.
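The abstract only outlines the refinement step, so the sketch below illustrates, in generic terms, how a shading-aware DSM refinement loop of this kind can be set up. It is a minimal illustration under stated assumptions, not the authors' implementation: the first-order spherical-harmonics lighting model, the quadratic smoothness and prior terms, their weights, the assumption that the views are already ortho-rectified to the DSM grid, and all function names (normals_from_dsm, fit_illumination, refine_dsm) are choices made here for the example. Note, though, how re-estimating each view's illumination from the current surface by least squares means no illumination or viewing-angle metadata is required, mirroring the property claimed above.

```python
import numpy as np
from scipy.optimize import minimize

def normals_from_dsm(Z, gsd=1.0):
    """Unit surface normals of a DSM grid via central differences."""
    dzdx = np.gradient(Z, gsd, axis=1)
    dzdy = np.gradient(Z, gsd, axis=0)
    n = np.dstack([-dzdx, -dzdy, np.ones_like(Z)])
    return n / np.linalg.norm(n, axis=2, keepdims=True)

def sh_basis(n):
    """First-order spherical-harmonics basis [1, nx, ny, nz] per pixel
    (an assumed lighting model, not necessarily the paper's)."""
    return np.dstack([np.ones(n.shape[:2]), n[..., 0], n[..., 1], n[..., 2]])

def fit_illumination(Z, image, albedo, gsd=1.0):
    """Least-squares fit of a 4-vector SH illumination to one view,
    holding the current surface and albedo fixed: no sun/sensor metadata."""
    B = sh_basis(normals_from_dsm(Z, gsd)).reshape(-1, 4)
    A = albedo.reshape(-1, 1) * B
    light, *_ = np.linalg.lstsq(A, image.reshape(-1), rcond=None)
    return light

def energy(z_flat, shape, images, albedo, lights, z0,
           lam_smooth, lam_prior, gsd=1.0):
    """Shading data term summed over all views, plus an assumed quadratic
    smoothness term and a prior pulling toward the initial stereo DSM."""
    Z = z_flat.reshape(shape)
    B = sh_basis(normals_from_dsm(Z, gsd))
    data = 0.0
    for img, light in zip(images, lights):
        shading = B @ light                      # rendered intensity per pixel
        data += np.sum((albedo * shading - img) ** 2)
    lap = (np.gradient(np.gradient(Z, axis=0), axis=0)
           + np.gradient(np.gradient(Z, axis=1), axis=1))
    return (data + lam_smooth * np.sum(lap ** 2)
            + lam_prior * np.sum((Z - z0) ** 2))

def refine_dsm(z_init, images, albedo, iters=3,
               lam_smooth=1.0, lam_prior=0.1):
    """Alternate illumination estimation and surface optimization."""
    Z = z_init.copy()
    for _ in range(iters):
        lights = [fit_illumination(Z, img, albedo) for img in images]
        # L-BFGS-B here falls back to finite-difference gradients,
        # which only scales to toy grids; a practical implementation
        # would supply analytic derivatives of the shading term.
        res = minimize(energy, Z.ravel(),
                       args=(Z.shape, images, albedo, lights, z_init,
                             lam_smooth, lam_prior),
                       method="L-BFGS-B")
        Z = res.x.reshape(Z.shape)
    return Z
```

On a toy problem one would call refine_dsm(z0, [img1, img2, img3], albedo) with small, co-registered image patches; using several views in the data term is what resolves the concave/convex ambiguity that single-image SfS suffers from, since no single illumination flip can explain all views at once.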

Keywords: Shape from Shading (SfS); multi-view stereo; Digital Surface Model (DSM); high resolution multi-view satellite images

Zhihua Hu, Pengjie Tao, Xiaoxiang Long, Haiyan Wang


School of Remote Sensing and Information Engineering, Wuhan University, Wuhan, China

China Centre for Resources Satellite Data and Application, Beijing, China

2024

Geo-spatial Information Science
Wuhan University (formerly Wuhan Technical University of Surveying and Mapping)

Impact factor: 0.207
ISSN: 1009-5020
Year, Volume (Issue): 2024, 27(2)