Unsupervised Cross-Media Hashing Learning via Knowledge Graph

With the rapid growth of multimedia data, cross-media hashing has become an important technology for fast cross-media retrieval. Because manual annotations are difficult to obtain in real-world applications, unsupervised cross-media hashing is studied to address hash learning without manual annotations. Existing unsupervised cross-media hashing methods generally focus on calculating similarities through the features of multimedia data, so the learned hash codes cannot reflect the semantic relationships among the multimedia data, which hinders the accuracy of cross-media retrieval. When humans try to understand multimedia data, the knowledge of concept relations in our brain plays an important role in obtaining high-level semantics. Inspired by this, we propose a knowledge guided unsupervised cross-media hashing (KGUCH) approach, which applies the knowledge graph to construct high-level semantic correlations for unsupervised cross-media hash learning. Our contributions in this paper can be summarized as follows: 1) The knowledge graph is introduced as auxiliary knowledge to construct the semantic graph for the concepts in each image and text instance, which can bridge the multimedia data with high-level semantic correlations and improve the accuracy of the learned hash codes for cross-media retrieval. 2) The proposed KGUCH approach constructs correlations of the multimedia data from both the semantic and the feature aspects, which can exploit complementary information to promote unsupervised cross-media hash learning. Experiments are conducted on three widely used datasets, which verify the effectiveness of our proposed KGUCH approach.
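The abstract does not give KGUCH's actual formulation, but the core idea it describes, fusing a knowledge-graph-based semantic similarity with a feature-based similarity before learning binary codes, can be illustrated with a generic sketch. Everything below is an assumption for illustration: the Jaccard overlap of knowledge-graph-expanded concept sets as the semantic similarity, cosine similarity for features, and a simple spectral-relaxation binarization; none of these choices are claimed to be the paper's method.

```python
import numpy as np

def cosine_sim(feats):
    # feature-level similarity: cosine between L2-normalized feature vectors
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T

def concept_sim(concepts, kg_edges):
    # semantic-level similarity (illustrative): expand each instance's concept
    # set with its knowledge-graph neighbours, then take the Jaccard overlap
    neigh = {}
    for a, b in kg_edges:
        neigh.setdefault(a, set()).add(b)
        neigh.setdefault(b, set()).add(a)
    expanded = [set(c).union(*(neigh.get(x, set()) for x in c)) for c in concepts]
    n = len(concepts)
    S = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            union = len(expanded[i] | expanded[j])
            S[i, j] = len(expanded[i] & expanded[j]) / union if union else 0.0
    return S

def learn_hash_codes(feats, concepts, kg_edges, n_bits=8, alpha=0.5):
    # fuse the two similarity views, then binarize a spectral relaxation:
    # the top eigenvectors of the fused similarity, thresholded at zero
    S = alpha * concept_sim(concepts, kg_edges) + (1 - alpha) * cosine_sim(feats)
    _, vecs = np.linalg.eigh(S)           # eigenvalues in ascending order
    Y = vecs[:, -n_bits:]                 # eigenvectors of the largest eigenvalues
    return (Y >= 0).astype(np.uint8)      # one binary code (row) per instance
```

In this toy setup the knowledge graph is what lets semantically related but lexically different instances (e.g. "dog" and "puppy") receive similar codes, which is the role the abstract attributes to the semantic graph; the `alpha` weight balancing the two views is likewise a hypothetical knob.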

Keywords: Cross-media hashing; Knowledge graph; Unsupervised learning

YE Zhaoda, HE Xiangteng, PENG Yuxin


Wangxuan Institute of Computer Technology, Peking University, Beijing 100091, China

National Natural Science Foundation

2022

Chinese Journal of Electronics

Indexed in: CSTPCD, SCI, EI
ISSN:1022-4653
Year, Volume (Issue): 2022, 31(6)