# Deep Learning Basics II

Last lecture focused on Feed Forward Neural Networks and the Universal Approximation Theorem. In today's lecture we will learn about the Back Propagation Algorithm, the Gradient Descent Algorithm, the Stochastic Gradient Descent Algorithm, and types of functions.

## Back Propagation Algorithm

It computes the gradient in time linear in the size of the network, i.e. the running time is O(V+E), where…
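To make the O(V+E) claim concrete, here is a minimal sketch (not the lecture's own code) of backpropagation on a tiny one-hidden-unit network: the forward pass visits each node once and caches intermediates, and the backward pass applies the chain rule once per edge, so the total work is linear in the network size. The network shape, parameter names, and squared-error loss are illustrative assumptions.

```python
import math

# Tiny feed-forward net: x -> sigmoid(w1*x + b1) -> w2*h + b2 -> squared loss.
# Forward visits each node once; backward applies one chain-rule step per
# edge, reusing cached activations, hence O(V + E) total work.

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, y, w1, b1, w2, b2):
    # Cache every intermediate value the backward pass will need.
    z1 = w1 * x + b1
    h = sigmoid(z1)
    yhat = w2 * h + b2
    loss = 0.5 * (yhat - y) ** 2
    return loss, (x, h, yhat, y)

def backward(cache, w2):
    # One chain-rule step per parameter/edge, no recomputation.
    x, h, yhat, y = cache
    d_yhat = yhat - y               # dL/dyhat
    d_w2 = d_yhat * h
    d_b2 = d_yhat
    d_h = d_yhat * w2
    d_z1 = d_h * h * (1.0 - h)      # sigmoid'(z1) = h * (1 - h)
    d_w1 = d_z1 * x
    d_b1 = d_z1
    return d_w1, d_b1, d_w2, d_b2

w1, b1, w2, b2 = 0.5, -0.2, 1.5, 0.1
loss, cache = forward(2.0, 1.0, w1, b1, w2, b2)
grads = backward(cache, w2)

# Sanity check: dL/dw1 should match a central finite difference.
eps = 1e-6
lp, _ = forward(2.0, 1.0, w1 + eps, b1, w2, b2)
lm, _ = forward(2.0, 1.0, w1 - eps, b1, w2, b2)
print(abs(grads[0] - (lp - lm) / (2 * eps)) < 1e-6)
```

The finite-difference check at the end is a standard way to verify a hand-written backward pass: the analytic gradient and the numerical one should agree to high precision.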