
Enhancing GAN Training Stability through Xavier Glorot Initialization: A Solution to Unstable Training

Author(s): Sourabh Mehta


This research paper addresses the unstable training commonly encountered when training Generative Adversarial Networks (GANs). It emphasizes the significance of weight initialization in the generator and discriminator networks for stabilizing GAN training. Specifically, the study focuses on Xavier Glorot initialization, a popular weight-initialization technique, and its potential to improve the quality of generated data by promoting training stability. Through experimental analysis and evaluation, the paper assesses how effectively Xavier Glorot initialization stabilizes GAN training and improves the overall quality of the generated output.
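The paper itself does not include code, but the core idea can be sketched in a few lines. Xavier Glorot (uniform) initialization draws each weight from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)), which keeps activation and gradient variance roughly constant across layers. The layer sizes below are hypothetical, chosen only to illustrate applying the scheme to a small GAN generator:

```python
import numpy as np

def xavier_glorot_uniform(fan_in, fan_out, rng=None):
    """Draw a (fan_in, fan_out) weight matrix from the Glorot uniform
    distribution U(-limit, limit), limit = sqrt(6 / (fan_in + fan_out))."""
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Hypothetical layer sizes for a small GAN generator mapping a
# 100-dim noise vector to a 784-dim (28x28) image.
generator_layers = [(100, 256), (256, 512), (512, 784)]
weights = [xavier_glorot_uniform(fi, fo) for fi, fo in generator_layers]

for w, (fi, fo) in zip(weights, generator_layers):
    limit = np.sqrt(6.0 / (fi + fo))
    print(w.shape, np.abs(w).max() <= limit)
```

Deep-learning frameworks expose the same scheme directly (e.g. `torch.nn.init.xavier_uniform_` in PyTorch or `tf.keras.initializers.GlorotUniform` in TensorFlow), so in practice one would call the framework initializer on each generator and discriminator layer rather than hand-rolling it.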
