Robotics & Machine Learning Daily News, 2024, Issue (Oct. 7): 63-63.

Study Findings from Nagoya University Broaden Understanding of Neural Computation (Latent Space Bayesian Optimization With Latent Data Augmentation for Enhanced Exploration)


Abstract

Investigators publish new report on neural computation. According to news reporting out of Nagoya University by NewsRx editors, research stated, "Latent space Bayesian optimization (LSBO) combines generative models, typically variational autoencoders (VAE), with Bayesian optimization (BO), to generate de novo objects of interest." The news editors obtained a quote from the research from Nagoya University: "However, LSBO faces challenges due to the mismatch between the objectives of BO and VAE, resulting in poor exploration capabilities. In this article, we propose novel contributions to enhance LSBO efficiency and overcome this challenge. We first introduce the concept of latent consistency/inconsistency as a crucial problem in LSBO, arising from the VAE-BO mismatch. To address this, we propose the latent consistency-aware acquisition function (LCA-AF) that leverages consistent points in LSBO. Additionally, we present LCA-VAE, a novel VAE method that creates a latent space with increased consistent points through data augmentation in latent space and penalization of latent inconsistencies. Combining LCA-VAE and LCA-AF, we develop LCA-LSBO."
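The latent consistency idea quoted above can be illustrated with a toy sketch: a latent point is "consistent" when re-encoding its decoding lands back near the same point, and an acquisition function can down-weight inconsistent candidates. The sketch below is an illustrative assumption only, not the paper's implementation: `decode`, `encode`, and `lca_acquisition` are hypothetical stand-ins, with a nonlinear toy decoder and a crude linear encoder in place of a trained VAE.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a VAE: nonlinear decoder, crude linear "encoder".
# All names and forms here are illustrative assumptions, not the paper's code.
W = rng.normal(size=(5, 2))                 # maps 2-d latent -> 5-d data space

def decode(z):
    return np.tanh(z @ W.T)                 # nonlinear toy decoder

def encode(x):
    return x @ np.linalg.pinv(W).T          # linear pseudo-inverse "encoder"

def latent_inconsistency(z):
    # Round-trip gap: a latent point is "consistent" when re-encoding its
    # decoding lands (approximately) back on the same point.
    return np.linalg.norm(encode(decode(z)) - z, axis=-1)

def lca_acquisition(z, base_acq, lam=1.0):
    # Consistency-aware acquisition: penalize candidates with a large
    # round-trip gap (a hypothetical form inspired by the described LCA-AF).
    return base_acq(z) - lam * latent_inconsistency(z)

# Rank random latent candidates under a placeholder base acquisition.
candidates = rng.normal(size=(100, 2))
base = lambda z: -np.linalg.norm(z, axis=-1)    # dummy surrogate score
best = candidates[np.argmax(lca_acquisition(candidates, base))]
```

In this toy setup, points near the latent origin round-trip almost exactly (the decoder is near-linear there), while far-out points saturate the `tanh` and re-encode poorly, so the penalty steers selection toward the region where the decoder is trustworthy.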

Key words

Nagoya University/Computation/Neural Computation


Publication year

2024
Robotics & Machine Learning Daily News
