Auto-Encoding Variational Bayes
DOI: https://doi.org/10.61603/ceas.v2i1.33

Keywords: Auto-Encoding Variational Bayes, Stochastic Gradient Variational Bayes, Dynamic Bayesian Network, Variational Autoencoders

Abstract
This paper employs the Auto-Encoding Variational Bayes (AEVB) estimator, based on Stochastic Gradient Variational Bayes (SGVB), to optimize recognition models for intractable posterior distributions and large-scale datasets. The estimator is applied to the MNIST dataset and extended to form a Dynamic Bayesian Network (DBN) for time-series data. The paper covers Bayesian inference, variational methods, and the combination of Variational Autoencoders (VAEs) with variational techniques, with emphasis on the reparameterization trick for efficient optimization. AEVB employs VAEs to approximate intricate posterior distributions.
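The reparameterization trick mentioned above can be illustrated with a minimal sketch (variable names and values are illustrative, not from the paper): rather than sampling the latent variable z directly from N(mu, sigma^2), which blocks gradients, one samples parameter-free noise eps ~ N(0, I) and computes z as a deterministic, differentiable function of mu and sigma.

```python
import numpy as np

rng = np.random.default_rng(0)

# Encoder outputs for one data point (illustrative values):
mu = np.array([0.5, -1.0])       # approximate-posterior mean
log_var = np.array([0.0, 0.2])   # approximate-posterior log-variance

# Reparameterization: z = mu + sigma * eps, with eps independent of
# the parameters, so gradients flow through mu and log_var.
eps = rng.standard_normal(mu.shape)
z = mu + np.exp(0.5 * log_var) * eps

print(z.shape)  # (2,)
```

In a full VAE, this sampled z is fed to the decoder, and the SGVB estimator backpropagates the reconstruction and KL terms through mu and log_var.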
License
Copyright (c) 2024 Yankun Chen, Jingxuan Liu, Lingyun Peng, Yiqi Wu, Yige Xu, Zhanhao Zhang

This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.