Weighted-loss-based Up-sampling for Point Cloud Occupancy Map Video
In video-based point cloud compression (V-PCC), a 3D point cloud is divided into hundreds of patches and then projected onto a 2D grid, generating a texture video that captures texture information and a geometry video that captures geometry information. Meanwhile, an occupancy map video is also generated to record whether each pixel in the former two videos corresponds to a point in the reconstructed point cloud. The quality of the occupancy map video is therefore directly linked to the quality of the reconstructed point cloud. To save bits, the occupancy map video is down-sampled at the encoder and up-sampled with a simple method at the decoder. This paper replaces that simple up-sampling method in the original V-PCC with a deep-learning-based one, improving the quality of the up-sampled occupancy map videos and, in turn, that of the reconstructed point cloud. A weighted distortion loss function is introduced in network training so that, when the point cloud is reconstructed, as few normal points as possible are removed while as many noisy points as possible are removed. Experimental results show that the proposed method significantly improves both the subjective and objective performance of V-PCC.
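The paper does not include source code; the sketch below is only a minimal PyTorch-style illustration of how such a weighted occupancy loss could look. The function name weighted_occupancy_loss and the weight values w_miss and w_extra are hypothetical, assumed from the abstract's statement that removing normal points and keeping noisy points should be penalized asymmetrically; the nearest-neighbor baseline mirrors the simple up-sampling V-PCC applies at the decoder.

```python
import torch
import torch.nn.functional as F

def weighted_occupancy_loss(logits, target, w_miss=4.0, w_extra=1.0):
    """Illustrative weighted BCE for occupancy-map up-sampling.

    logits  -- raw outputs of the up-sampling network, shape (N, 1, H, W)
    target  -- full-resolution ground-truth occupancy map in {0, 1}
    w_miss  -- penalty for predicting 0 where the target is 1
               (a normal point would be removed from the cloud)
    w_extra -- penalty for predicting 1 where the target is 0
               (a noisy point would be kept)
    The weight values are hypothetical; the paper only states that the
    two error types are weighted differently.
    """
    per_pixel = F.binary_cross_entropy_with_logits(logits, target,
                                                   reduction="none")
    weights = torch.where(target > 0.5,
                          torch.full_like(target, w_miss),
                          torch.full_like(target, w_extra))
    return (weights * per_pixel).mean()

# Baseline for comparison: V-PCC's simple up-sampling replicates each
# low-resolution occupancy value over a block (nearest-neighbor).
def simple_upsample(occ_low, factor=4):
    return F.interpolate(occ_low, scale_factor=factor, mode="nearest")
```

Setting w_miss larger than w_extra biases the network toward keeping true points at the cost of retaining a few extra noisy ones; the actual trade-off used in the paper may differ.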

Point cloud compression; Video-based point cloud compression standard; Occupancy map video; Video up-sampling; Weighted distortion loss

Hang Chen, Li Li, Dong Liu, Houqiang Li

School of Information Science and Technology, University of Science and Technology of China, Hefei 230026, China


National Natural Science Foundation of China (62171429)


Computer Science
Chongqing Southwest Information Co., Ltd. (formerly the Southwest Information Center of the Ministry of Science and Technology)


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.944
ISSN: 1002-137X
Year, Volume (Issue): 2024, 51(1)