In the real world, it is not unusual to encounter datasets that have imbalanced classes. This post discusses some strategies for dealing with such situations.
Building complex LaTeX documents can be quite painful. Thankfully, a little program called Tup can automate the build process for you.
How to address mode collapse, a commonly encountered failure case for GANs where the generator learns to produce samples with extremely low variety.
Deciphering the GAN objective used in practice, a detour through theory, and a practical reformulation of the GAN objective in a more general form.
A look at the objective function introduced in the InfoGAN paper, and why InfoGAN really isn't that complicated to implement.
Implementing neural networks can be intimidating at the start, with a daunting number of choices to make and no real sense of which option might work best. By listing my (opinionated) defaults found from experience, I hope to provide you, dear reader, with a starting point from which you can train a successful neural network.
Logarithms are surprisingly useful, but can be quite expensive to compute. In this post we explore an interesting method for approximating base-2 logarithms extremely quickly.
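The post's specific method isn't described in this summary, but to give a flavor of the idea: one classic family of fast log2 approximations reads the bits of an IEEE-754 float directly, since the exponent field already encodes the integer part of the logarithm. A minimal sketch (not necessarily the post's method):

```python
import struct

def fast_log2(x: float) -> float:
    """Approximate log2(x) for x > 0 by reinterpreting the float's bits.

    In IEEE-754, a double is laid out as (exponent + mantissa-fraction)
    scaled by 2^52, biased by 1023. Dividing the raw bits by 2^52 and
    subtracting the bias yields floor(log2(x)) plus a linear
    interpolation from the mantissa -- exact at powers of two, and
    within about 0.09 everywhere else.
    """
    bits = struct.unpack(">Q", struct.pack(">d", x))[0]
    return bits / (1 << 52) - 1023.0

print(fast_log2(2.0))  # exactly 1.0 (powers of two are exact)
print(fast_log2(3.0))  # about 1.5, vs. the true value 1.585
```

The appeal is that this costs only a bit reinterpretation, one multiply, and one subtract, with no loops or table lookups.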