Abstract
This paper investigates the challenge of training Generative Adversarial Networks (GANs), a process frequently hindered by unstable training dynamics. It emphasizes the role of weight initialization in the generator and discriminator networks as a means of stabilizing training. Specifically, the study focuses on Xavier Glorot initialization, a widely used weight-initialization technique, and its potential to improve the quality of generated data by promoting training stability. Through experimental analysis and evaluation, the paper examines how effectively Xavier Glorot initialization stabilizes GAN training and improves the overall quality of the generated outputs.
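As a minimal illustrative sketch (not the paper's actual code), the snippet below shows how Xavier Glorot initialization can be applied to a GAN's generator and discriminator before training; the use of PyTorch and the toy layer sizes are assumptions made for illustration only.

```python
# Illustrative sketch: applying Xavier (Glorot) initialization to a GAN.
# The architectures and framework (PyTorch) are assumptions, not the paper's setup.
import torch.nn as nn

def init_weights_xavier(module: nn.Module) -> None:
    """Apply Xavier uniform initialization, W ~ U(-a, a) with a = sqrt(6 / (n_in + n_out))."""
    if isinstance(module, (nn.Linear, nn.Conv2d, nn.ConvTranspose2d)):
        nn.init.xavier_uniform_(module.weight)
        if module.bias is not None:
            nn.init.zeros_(module.bias)

# Toy generator and discriminator with illustrative layer sizes.
generator = nn.Sequential(
    nn.Linear(100, 256), nn.ReLU(),
    nn.Linear(256, 784), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(784, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

# Recursively apply the initializer to every submodule before training begins.
generator.apply(init_weights_xavier)
discriminator.apply(init_weights_xavier)
```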
Lattice | Vol 4 Issue 2