Auto-Encoding Variational Bayes

Authors

  • Yankun Chen Bell Honors School, Nanjing University of Posts and Telecommunications
  • Jingxuan Liu International Elite Engineering School, East China University of Science and Technology
  • Lingyun Peng College of Cyberspace Security, Guangzhou University
  • Yiqi Wu International College of Engineering, Changsha University of Science and Technology
  • Yige Xu College of Electronic Science and Engineering, Jilin University
  • Zhanhao Zhang School of Mechanical Transport and Engineering, Chongqing University

DOI:

https://doi.org/10.61603/ceas.v2i1.33

Keywords:

Auto-Encoding Variational Bayes, Stochastic Gradient Variational Bayes, Dynamic Bayesian Network, Variational Autoencoders

Abstract

This paper employs the Auto-Encoding Variational Bayes (AEVB) estimator, built on Stochastic Gradient Variational Bayes (SGVB), to optimize recognition models in the presence of intractable posterior distributions and large-scale datasets. The estimator is applied to the MNIST dataset and extended to a Dynamic Bayesian Network (DBN) in the time-series setting. The paper covers Bayesian inference, variational methods, and the combination of Variational Autoencoders (VAEs) with variational techniques, placing emphasis on the reparameterization trick as the key to efficient gradient-based optimization. AEVB employs VAEs to approximate intricate posterior distributions.
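The reparameterization trick the abstract highlights can be illustrated concisely: instead of sampling the latent variable z directly from q(z|x) = N(mu, sigma^2), one samples auxiliary noise eps ~ N(0, I) and sets z = mu + sigma * eps, so gradients flow deterministically through mu and sigma. The sketch below (a minimal NumPy illustration with our own function names, not the paper's implementation) shows this sampling step together with the closed-form KL term of the standard VAE objective:

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps with eps ~ N(0, I); the randomness lives in eps,
    # so the sample is a differentiable function of mu and log_var.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def kl_divergence(mu, log_var):
    # Closed-form KL( N(mu, diag(sigma^2)) || N(0, I) ),
    # summed over the latent dimensions.
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var), axis=-1)

rng = np.random.default_rng(0)
mu = np.zeros((1, 2))       # encoder mean for one toy example
log_var = np.zeros((1, 2))  # encoder log-variance (sigma = 1)
z = reparameterize(mu, log_var, rng)
print(kl_divergence(mu, log_var))  # zero: q already matches the N(0, I) prior
```

In a full VAE this KL term is added to a reconstruction log-likelihood to form the evidence lower bound (ELBO) that SGVB estimates by Monte Carlo.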

Published

2024-02-07

Section

Articles

How to Cite

Auto-Encoding Variational Bayes. (2024). Cambridge Explorations in Arts and Sciences, 2(1). https://doi.org/10.61603/ceas.v2i1.33