`What we will learn in this session`


- The Neuron
- The Activation Function
- How do Neural Networks work?
- How do Neural Networks learn?
- Gradient Descent
- Stochastic Gradient Descent
- Backpropagation

# 1 : `THE NEURON`


A neuron (also called neurone or nerve cell) is a cell that carries electrical impulses. Neurons are the basic units of the nervous system. Every neuron is made of a cell body (also called a soma), dendrites and an axon. Dendrites and axons are nerve fibres.

- Dendrites : receive signals coming into the neuron
- Axon : transmits the signal away from the neuron

**As you can see, the dendrites of one neuron are connected to the axons of other neurons**

### How does it work in Machine Learning?

### Let's talk about Weights

Weights are crucial to Artificial Neural Networks because weights are how neural networks learn. By adjusting the weights, the network decides, for every case, which signals are important and which are unimportant to a particular neuron.
When you train an artificial neural network, you are essentially adjusting all of its weights. This is where `Gradient Descent` and `Backpropagation` come into play.
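To make the idea of weighted inputs concrete, here is a minimal sketch of a single artificial neuron. The names (`sigmoid`, `neuron`, the input and weight values) are all illustrative assumptions, not something fixed by the text:

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs (the "signals"), then a non-linear activation.
    # Training a network means adjusting `weights` (and `bias`).
    return sigmoid(np.dot(inputs, weights) + bias)

inputs = np.array([0.5, -1.2, 3.0])   # signals arriving at the dendrites
weights = np.array([0.8, 0.1, -0.4])  # how important each signal is
print(neuron(inputs, weights, bias=0.2))
```

Adjusting any entry of `weights` changes how strongly the corresponding input influences the neuron's output, which is exactly what learning modifies.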

So our main motive is to minimize the **LOSS FUNCTION**. In the case of Linear Regression, our loss function is the **Sum of Squared Errors**.
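As a quick illustration of the Sum of Squared Errors (the function name and sample values below are my own, for demonstration):

```python
def sum_squared_errors(y_true, y_pred):
    # Sum of the squared differences between actual and predicted values
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred))

# (3 - 2)^2 + (5 - 7)^2 = 1 + 4 = 5
print(sum_squared_errors([3, 5], [2, 7]))
```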

For example, suppose our cost function is y = x^2.
What value of x should we choose so that y is minimized?
This is exactly the problem that `Gradient Descent` solves.

So let's take x = 5; since y = x^2, we get y = 25. We don't know in advance which value of x minimizes y, and when there are many variables we can't simply inspect what is happening inside the function.

If we plot the function, we can easily judge that moving to the left would decrease y. But how does a neural network know which direction to move in?

Here you need to compute the slope of the tangent, which is nothing but dy/dx. Taking the derivative with respect to x, you can see that the slope (tan θ) is positive, so you need to move in the direction *opposite* to the slope to decrease y.

So we need to decrease the value of x, which is currently 5. We subtract from x the gradient scaled by a small constant, which is nothing but your **Learning Rate**.

dy/dx = 2x

x_new = x − 0.1 · dy/dx

5 − 0.1 · 2(5) = 4, which is one step of **Gradient Descent**.
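The update above can be repeated until x settles near the minimum. Here is a minimal sketch of that loop for y = x^2, using the starting point x = 5 and learning rate 0.1 from the text (the function name and step count are my own choices):

```python
def gradient_descent(x=5.0, lr=0.1, steps=50):
    # For y = x^2, the gradient is dy/dx = 2x.
    # Each step moves x opposite the gradient, scaled by the learning rate.
    for _ in range(steps):
        x = x - lr * 2 * x
    return x

print(gradient_descent(steps=1))  # first step: 5 - 0.1 * 2 * 5 = 4.0
print(gradient_descent())         # after many steps, x approaches 0, the minimum
```

Note how the step size shrinks automatically as x approaches 0, because the gradient 2x itself shrinks there.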