Many new data scientists have said that undergraduate-level ML classes left them without a satisfying way to learn the concepts of backpropagation and gradient computation in neural networks. So I thought I’d put together some useful learning resources to jump-start an understanding of this important process. The following list, curated from an informal Twitter poll, appears in no particular order.