Dynamic embeddings for efficient parameter learning of Bayesian network with multiple latent variables

Latent variables (LVs), which represent unobservable abstract concepts such as patient disease and customer credit, play an important role in simplifying network structure and improving the interpretability of Bayesian networks (BNs). However, incorporating LVs into a BN leads to missing probability parameters, since the LVs are never observed. The expectation-maximization (EM) algorithm, the classic method for parameter estimation with LVs, suffers from high complexity and slow convergence. To this end, we propose dynamic embeddings for parameter learning of BNs with LVs. Firstly, we reconstruct the E-step of EM and use dynamic embeddings to calculate the weights of fractional samples, which reduces the computational complexity of parameter learning. Secondly, we construct a pointwise mutual information (PMI) matrix to represent the directed weighted graphs (DWGs) transformed from the updated parameters. Thirdly, incremental singular value decomposition (SVD) is adopted to generate dynamic embeddings that capture the updated parameters while preserving the BN's graphical structure. Experimental results show that our proposed methods are efficient and effective. On real-world BNs, the efficiency, convergence and accuracy of our method outperform those of the state-of-the-art methods for parameter learning of BNs with multiple LVs. (C) 2022 Published by Elsevier Inc.
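The PMI-matrix-plus-SVD step described in the abstract can be illustrated with a minimal sketch. The weighted adjacency matrix `W`, the PPMI clipping, and the embedding dimension below are illustrative assumptions, not details from the paper; the paper's incremental SVD is replaced here by an ordinary batch SVD for simplicity.

```python
import numpy as np

# Hypothetical weighted adjacency matrix of a small DWG derived from
# updated BN parameters (values are illustrative only).
W = np.array([[0.0, 0.6, 0.4],
              [0.2, 0.0, 0.8],
              [0.5, 0.5, 0.0]])

def pmi_matrix(W, eps=1e-12):
    """Pointwise mutual information of edge weights, clipped to PPMI."""
    p_ij = W / W.sum()                      # joint probability estimate
    p_i = p_ij.sum(axis=1, keepdims=True)   # row marginals (source nodes)
    p_j = p_ij.sum(axis=0, keepdims=True)   # column marginals (target nodes)
    pmi = np.log((p_ij + eps) / (p_i * p_j + eps))
    return np.maximum(pmi, 0.0)             # keep positive PMI only

def svd_embeddings(M, d=2):
    """Rank-d node embeddings from the SVD of the PMI matrix."""
    U, S, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, :d] * np.sqrt(S[:d])        # scale by sqrt of singular values

E = svd_embeddings(pmi_matrix(W), d=2)
print(E.shape)  # one 2-d embedding per node
```

Scaling the left singular vectors by the square roots of the singular values is a common symmetric factorization choice for PMI-based embeddings; the paper's method updates this factorization incrementally as the parameters change across EM iterations.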

Bayesian network; Latent variable; Parameter learning; Graph embedding; Singular value decomposition; Imputation

Qi, Zhiwei; Yue, Kun; Duan, Liang; Hu, Kuang; Liang, Zhihong


Yunnan Univ

Southwest Forestry Univ

2022

Information Sciences


Indexed in: EI, SCI
ISSN:0020-0255
Year, Volume (Issue): 2022, 590