• Using SPSA for Optimizing Neural Networks Being a backprop ninja gives you a powerful tool for training neural networks. But what if the loss function is non-differentiable, or very expensive to compute? SPSA comes in handy in such cases. Let's dive deeper into SPSA in this post.

  • An overview of GANs Generative Adversarial Networks (GANs) are one of the hottest topics in deep learning right now. In this post I will describe GANs and the problems they suffer from.

  • Some Complex Terms in Deep Learning for Beginners You may have heard about divergences, manifolds, norms, etc. in deep learning. In this post I will simplify all these terms and briefly look at some of their common forms and applications in the deep learning literature.

  • Gradient Descent vs Coordinate Descent You may have used coordinate descent for optimizing many ML algorithms, such as Lasso and L2-norm SVM. In this post we will compare coordinate descent with gradient descent.

  • Welcome to Anshul's Blog I will be posting my experiments with deep learning and machine learning algorithms. I will be writing about my explorations in computer vision, speech processing and recognition, and the latest research.