Neural Networks
To get into the workings behind a neural network, let's build up from the concepts of logistic regression. Remember this flow chart from logistic regression? Logistic regression can be visualized as a very basic neural network: a single neuron computes the linear step $Z = w^T X + b$ and then applies a non-linear activation function $A = g(Z)$ (where $g(z) = \sigma(z)$ in this case).

A neural network is basically logistic regression on steroids! This is what a typical neural network looks like: it contains three kinds of layers; the input layer, the hidden layer(s), and the output layer. The input layer contains the values $x$, and the output layer contains the predicted value of $y$. The layers in between are called "hidden layers" because the true values of these nodes are not observed. Each neuron in a neural network does exactly the same thing as the logistic regression unit: it computes $Z = w^T X + b$ and then applies a non-linear activation function $A = g(Z)$.
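To make this concrete, here is a minimal NumPy sketch of the idea: a single neuron is exactly logistic regression, and a network with one hidden layer just stacks the same two-step computation (linear step, then activation) twice. The layer sizes, the column-vector convention, and the use of sigmoid in both layers are assumptions for illustration, not anything fixed by the text above.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(X, w, b):
    """One neuron: Z = w^T X + b, followed by A = sigma(Z). This is logistic regression."""
    Z = np.dot(w.T, X) + b
    return sigmoid(Z)

def forward(X, W1, b1, W2, b2):
    """Forward pass through a network with one hidden layer.
    Every neuron does the same two steps: a linear combination, then a non-linear activation."""
    Z1 = np.dot(W1, X) + b1      # hidden layer: linear step
    A1 = sigmoid(Z1)             # hidden layer: activation g(Z1)
    Z2 = np.dot(W2, A1) + b2     # output layer: linear step
    A2 = sigmoid(Z2)             # output layer: predicted y-hat
    return A2

# Tiny example: 3 input features, 4 hidden neurons, 1 output, 5 training examples.
rng = np.random.default_rng(0)
X  = rng.standard_normal((3, 5))
W1 = rng.standard_normal((4, 3)) * 0.01
b1 = np.zeros((4, 1))
W2 = rng.standard_normal((1, 4)) * 0.01
b2 = np.zeros((1, 1))
print(forward(X, W1, b1, W2, b2).shape)   # (1, 5): one prediction per example
```

Each column of `X` is one training example, so the whole forward pass is vectorized over examples just like the logistic regression version.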