
Monocular vision-based gaugeless water level measurement

Water level is a key element of hydrological monitoring, and its accurate measurement is of great significance for flood disaster prevention and water metering. With the construction of smart water management infrastructure and the large-scale deployment of video equipment, image-based water level recognition methods have developed rapidly and are currently a cutting-edge research direction in hydraulic measurement. This article proposes a monocular vision-based gaugeless water level measurement method. First, deep learning is used to build a water surface segmentation model that automatically extracts the waterline from water-edge images. Then, using the spatial mapping derived from camera calibration together with cross-section constraints, the 3D coordinates corresponding to waterline pixels are computed, and the water level is obtained by statistical processing. The method was applied to an indoor flume experiment on a physical river model. Water surface segmentation was accurate, with an average of 0.825 falsely segmented waterline pixels; the mean absolute error of the computed water level was about 1.5 mm and the root mean square error about 1.9 mm. The results show that the method accurately measures the water level variation process.
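The pixel-to-3D step described in the abstract can be sketched as follows. This is a minimal illustration under assumed values, not the paper's implementation: the calibration matrices, the cross-section plane, and the sample water level readings are all hypothetical numbers chosen for the example. Each waterline pixel is back-projected to a viewing ray, which is intersected with the known cross-section plane; the accuracy metrics (MAE, RMSE) are the standard definitions.

```python
import numpy as np

def backproject_to_section(pixel, K, R, t, n, d):
    """Intersect the viewing ray through `pixel` with the cross-section
    plane n . X = d (world frame) and return the 3D point on it."""
    C = -R.T @ t                               # camera centre in world frame
    uv1 = np.array([pixel[0], pixel[1], 1.0])  # homogeneous pixel coordinates
    ray = R.T @ np.linalg.inv(K) @ uv1         # ray direction in world frame
    s = (d - n @ C) / (n @ ray)                # ray-plane intersection parameter
    return C + s * ray

# Hypothetical calibration: camera at the world origin with axes aligned
# (R = I, t = 0), focal length 800 px, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)

# Cross-section constraint: the measured section lies in the plane z = 2 m.
n, d = np.array([0.0, 0.0, 1.0]), 2.0

# A waterline pixel 60 px below the principal point maps to y = 0.15 m
# on the section (60 / 800 * 2 m); y is then converted to a stage reading.
point = backproject_to_section((320.0, 300.0), K, R, t, n, d)

# Accuracy metrics reported in the paper (sample values here are made up, mm):
est = np.array([100.3, 101.8, 103.1])   # vision-based water levels
ref = np.array([100.0, 102.0, 103.0])   # point-gauge reference
err = est - ref
mae = np.abs(err).mean()                # mean absolute error
rmse = np.sqrt((err ** 2).mean())       # root mean square error
```

The statistical step in the paper aggregates many such points along the detected waterline; here a single pixel stands in for that set.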

water level measurement; monocular vision; water line detection; deep learning; camera calibration

刘子奇、李丹勋、朱德军、曹列凯


Department of Hydraulic Engineering, Tsinghua University, Beijing 100084, China

National Key Laboratory of Hydrosphere Science and Hydraulic Engineering, Tsinghua University, Beijing 100084, China

School of Water Resources and Hydropower Engineering, North China Electric Power University, Beijing 102206, China


National Key R&D Program of China (2022YFC3201803); National Natural Science Foundation of China (U2243240); Hunan Provincial Water Resources Science and Technology Project (XSKJ2023059-05)


Chinese Journal of Scientific Instrument (仪器仪表学报)
China Instrument and Control Society

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 2.372
ISSN: 0254-3087
Year, volume (issue): 2024, 45(7)