# Perceptron Algorithm | Pegasos Algorithm | Support Vector Machine | Linear Classifier Among Dataset | Deep Learning | Machine Learning

## PERCEPTRON ALGORITHM

y = θ·x + θ0

### It is also an iterative method. We start with θ = 0 (the zero vector) and θ0 = 0. In each iteration, for every data point:

•  if yi·(θ·xi + θ0) <= 0:
•  update θnew = θold + yi·xi
   θ0new = θ0old + yi
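The update rule above can be sketched as follows (a minimal NumPy implementation; the toy dataset `X`, `y` and the epoch count are illustrative assumptions, not from the original post):

```python
import numpy as np

def perceptron(X, y, epochs=10):
    """Perceptron: update theta, theta0 whenever a point is misclassified."""
    theta = np.zeros(X.shape[1])
    theta0 = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Misclassified (or on the boundary) when yi * (theta . xi + theta0) <= 0
            if yi * (theta @ xi + theta0) <= 0:
                theta = theta + yi * xi
                theta0 = theta0 + yi
    return theta, theta0

# Toy linearly separable data (illustrative)
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -1.0], [-2.0, -3.0]])
y = np.array([1, 1, -1, -1])
theta, theta0 = perceptron(X, y)
```

On linearly separable data like this, the loop stops updating once every point satisfies yi·(θ·xi + θ0) > 0.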

### H(z) = max{0, 1 − z},  z = yi·(θ·xi + θ0)

This means the loss is zero if z ≥ 1, and 1 − z otherwise.
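A quick sanity check of the hinge loss at a few illustrative values of z:

```python
def hinge(z):
    # H(z) = max{0, 1 - z}
    return max(0.0, 1.0 - z)

hinge(2.0)   # correctly classified, outside the margin -> 0.0
hinge(0.5)   # correctly classified, but inside the margin -> 0.5
hinge(-1.0)  # misclassified -> 2.0
```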

### J = Loss + Regularization

J(θ, θ0) = (1/n) Σi=1..n H(yi·(θ·xi + θ0)) + (λ/2)||θ||²
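The objective above can be computed directly (a minimal sketch; the example values of `theta`, `X`, `y`, and `lam` are illustrative assumptions):

```python
import numpy as np

def objective(theta, theta0, X, y, lam):
    """J = average hinge loss over the dataset + (lambda/2) * ||theta||^2."""
    z = y * (X @ theta + theta0)
    hinge = np.maximum(0.0, 1.0 - z)
    return hinge.mean() + 0.5 * lam * np.dot(theta, theta)

# Illustrative example: z = [2, 0.5], hinge = [0, 0.5], so
# J = 0.25 (avg hinge) + 0.05 (regularization) = 0.3
theta = np.array([1.0, 0.0])
X = np.array([[2.0, 0.0], [-0.5, 0.0]])
y = np.array([1, -1])
J = objective(theta, 0.0, X, y, lam=0.1)
```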

### So, in each iteration we consider only one point, compute the cost for that point alone, and update.

J(θ, θ0) = H(yi·(θ·xi + θ0)) + (λ/2)||θ||²
Since ∂H(z)/∂θ = 0 if z > 1, and −∂z/∂θ otherwise,
the update formulas from stochastic gradient descent are:
if z <= 1:

θnew = θold − η[−yi·xi + λ·θold]
     = θold + η·yi·xi − η·λ·θold
θ0new = θ0old + η·yi

else:

θnew = θold − η·λ·θold
θ0new = θ0old
We decrease η as the number of iterations increases.
Try to implement and understand the code below.
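A minimal Pegasos-style sketch of the update rules above (the toy data, the value of λ, the epoch count, and the η = 1/(λ·t) decreasing step-size schedule are illustrative assumptions, not details from the original post):

```python
import numpy as np

def pegasos(X, y, lam=0.01, epochs=100, seed=0):
    """Pegasos: stochastic sub-gradient descent on hinge loss + L2 penalty."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    theta = np.zeros(d)
    theta0 = 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):  # visit points in random order
            t += 1
            eta = 1.0 / (lam * t)  # decreasing step size (illustrative schedule)
            z = y[i] * (X[i] @ theta + theta0)
            if z <= 1:
                # Inside the margin or misclassified: hinge term contributes
                theta = theta + eta * y[i] * X[i] - eta * lam * theta
                theta0 = theta0 + eta * y[i]
            else:
                # Outside the margin: only the regularization term contributes
                theta = theta - eta * lam * theta
    return theta, theta0

# Toy linearly separable data (illustrative)
X = np.array([[2.0, 1.0], [1.0, 3.0], [-1.0, -1.0], [-2.0, -3.0]])
y = np.array([1, 1, -1, -1])
theta, theta0 = pegasos(X, y)
```

Note that in the `else` branch only θ shrinks toward zero; θ0 is left unchanged because it is not regularized.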

If we remove the regularization term and take η = 1, this becomes the simple Perceptron algorithm.