In the real world, it is not unusual to encounter datasets with imbalanced classes. This post discusses some strategies for dealing with such situations.
How to address mode collapse, a commonly encountered failure case for GANs in which the generator learns to produce samples with extremely low variety.
Deciphering the GAN objective used in practice, via a detour through theory, and a practical reformulation of the GAN objective in a more general form.
A look at the objective function introduced in the InfoGAN paper, and why InfoGAN really isn't that complicated to implement.
Implementing neural networks can be intimidating at the start: there are a daunting number of choices to make, with no real sense of which option might work best. By listing my (opinionated) defaults found from experience, I hope to provide you, dear reader, with a starting point from which you can train a successful neural network.