Engineering Journal of Wuhan University, 2024, Vol. 57, Issue 8: 1150-1159. DOI: 10.14188/j.1671-8844.2024-08-015

Unsupervised video reconstruction based on event camera

刘凢 余磊

Author information

  • 1. School of Electronic Information, Wuhan University, Wuhan 430072, Hubei, China

Abstract

In practical applications, the mismatch between the event clock and the frame rate of a conventional camera often introduces a spatiotemporal offset between the two modalities, making it difficult to obtain accurately aligned one-to-one event-image data pairs and thus preventing effective supervised training of the network. To address this spatiotemporal matching problem, an unsupervised video reconstruction method for event cameras is proposed that exploits the ability of cycle-consistent generative adversarial networks (CycleGAN) to learn the overall distribution of images, realizing unsupervised reconstruction on event cameras without spatiotemporally matched data. Experimental results show that, compared with existing event-camera-based video reconstruction methods, the proposed method improves on three metrics: structural similarity (SSIM), mean-square error (MSE), and the blind/referenceless image spatial quality evaluator (BRISQUE). Even in the absence of spatiotemporally matched data, the proposed method can reconstruct relatively clear, high-frame-rate video.
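The key idea in the abstract is that, without paired event-image data, training can be supervised by a CycleGAN-style cycle-consistency loss instead of a pixel-wise loss against ground truth. The toy sketch below illustrates only that loss; the two "generators" are hypothetical invertible stand-ins, not the paper's actual networks:

```python
# Toy illustration of the cycle-consistency loss used by CycleGAN-style
# training in place of paired supervision. g_event_to_image / g_image_to_event
# are hypothetical placeholder generators, not the paper's networks.
def g_event_to_image(x):
    # placeholder "generator": a fixed invertible pointwise map
    return [2.0 * v + 1.0 for v in x]

def g_image_to_event(y):
    # placeholder inverse "generator"
    return [(v - 1.0) / 2.0 for v in y]

def cycle_consistency_loss(x):
    """Mean L1 cycle loss |G_IE(G_EI(x)) - x|: translating to the other
    domain and back should recover the input, which needs no paired data."""
    x_rec = g_image_to_event(g_event_to_image(x))
    return sum(abs(a - b) for a, b in zip(x_rec, x)) / len(x)

events = [0.1, 0.5, 0.9, 0.3]
print(cycle_consistency_loss(events))  # near 0 for this invertible toy pair
```

In actual training this loss is combined with adversarial losses in each domain, so the generators match the target image distribution while the cycle term keeps content consistent.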

Key words

event camera / video reconstruction / deep learning / CycleGAN


Funding

National Natural Science Foundation of China (62271354)

National Natural Science Foundation of China (61871297)

Natural Science Foundation of Hubei Province (2021CFB467)

Year of publication

2024

Journal: Engineering Journal of Wuhan University (武汉大学学报(工学版))
Publisher: Wuhan University
Indexed in: CSTPCD, CSCD, Peking University Core Journals (北大核心)
Impact factor: 0.621
ISSN: 1671-8844
References: 23