Implementation of Backpropagation Algorithm Exp-4
Experiment No 4: Implementation of Backpropagation
clc;
clear;
x=[0.6 0.8 0]; %input vector
v=[2 1 0;1 2 2;0 3 1]; %input-to-hidden weights, v(i,j): input i -> hidden unit j
w=[-1 1 2]; %hidden-to-output weights
b1=[0 0 -1]; %hidden-layer biases
b2=[-1]; %output bias
t=[0.9]; %target output
a=0.3; %learning rate
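For reference, the updates computed by this script are the standard gradient-descent (delta) rules for a sigmoid network; sketched in the script's own notation (with alpha for the learning rate a):

```latex
\delta_1 = (t - y)\,y(1-y), \qquad
\delta_j = \delta_1\, w_j\, z_j(1-z_j),
```

```latex
w_j \leftarrow w_j + \alpha\,\delta_1 z_j, \quad
v_{ij} \leftarrow v_{ij} + \alpha\,\delta_j x_i, \quad
b1_j \leftarrow b1_j + \alpha\,\delta_j, \quad
b2 \leftarrow b2 + \alpha\,\delta_1 .
```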
%Feed forward Stage
for j=1:3
x1=0;
for i=1:3
x1=x1+x(i)*v(i,j);
end;
zin(j)=b1(j)+x1; %Net input to hidden unit j
end;
z1=0;
for i=1:3
z(i)=1/(1+exp(-zin(i))); %Hidden activation (binary sigmoid)
z1=z1+z(i)*w(i);
end
yin=b2+z1; %Net input to output unit
y=1/(1+exp(-yin)); %Output layer
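As a quick cross-check of the feed-forward stage, the same computation can be reproduced in plain Python (a sketch mirroring the MATLAB variables; `sigmoid` is a helper defined here, not part of the original script):

```python
import math

# Constants from the experiment above
x = [0.6, 0.8, 0.0]                      # input vector
v = [[2, 1, 0],
     [1, 2, 2],
     [0, 3, 1]]                          # v[i][j]: input i -> hidden unit j
w = [-1, 1, 2]                           # hidden-to-output weights
b1 = [0, 0, -1]                          # hidden biases
b2 = -1.0                                # output bias

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# Net input and activation of each hidden unit
zin = [b1[j] + sum(x[i] * v[i][j] for i in range(3)) for j in range(3)]
z = [sigmoid(s) for s in zin]            # ~ [0.8808, 0.9002, 0.6457]
# Output unit
yin = b2 + sum(zj * wj for zj, wj in zip(z, w))
y = sigmoid(yin)                         # ~ 0.5771
```

The values agree with the zin and z vectors in the OUTPUT section below.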
%Backpropagation of error and weight updates
f=y*(1-y); %Sigmoid derivative at the output
d1=(t-y)*f; %Output-layer error term (delta)
for i=1:3
w2(i)=w(i)+a*d1*z(i); %Hidden-to-output weight update
din(i)=d1*w(i); %Error propagated back to hidden unit i
dj(i)=din(i)*z(i)*(1-z(i)); %Hidden-layer error term (delta)
end;
for j=1:3
for i=1:3
v1(j,i)=v(j,i)+a*dj(i)*x(j); %Delta rule: input x(j), hidden delta dj(i)
end;
end;
for i=1:3
b(i)=b1(i)+a*dj(i); %Hidden bias update
end;
b2=b2+(a*d1);
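The backward pass can be cross-checked the same way. This self-contained plain-Python sketch applies the standard delta rule (each v(i,j) updated with the product of input x(i) and hidden delta dj(j)) and reproduces the dj, w2, b, and b2 values shown in the OUTPUT section:

```python
import math

x = [0.6, 0.8, 0.0]
v = [[2, 1, 0], [1, 2, 2], [0, 3, 1]]
w = [-1, 1, 2]
b1 = [0, 0, -1]
b2, t, a = -1.0, 0.9, 0.3          # output bias, target, learning rate

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

# Feed-forward pass
z = [sigmoid(b1[j] + sum(x[i] * v[i][j] for i in range(3))) for j in range(3)]
y = sigmoid(b2 + sum(zj * wj for zj, wj in zip(z, w)))

# Backward pass
d1 = (t - y) * y * (1 - y)                              # output delta
dj = [d1 * w[j] * z[j] * (1 - z[j]) for j in range(3)]  # hidden deltas
w2 = [w[j] + a * d1 * z[j] for j in range(3)]           # hidden-to-output update
v1 = [[v[i][j] + a * dj[j] * x[i] for j in range(3)]    # input-to-hidden update
      for i in range(3)]
b_new = [b1[j] + a * dj[j] for j in range(3)]           # hidden bias update
b2_new = b2 + a * d1                                    # output bias update
```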
disp('The weight matrix for input');
disp(v1);
disp('Bias matrix for input');
disp(b);
disp(' The weight and bias matrix for hidden layer');
disp(w2);
disp('bias for output');
disp(b2);
OUTPUT
zin =
2.0000 2.2000 0.6000
z =
0.8808 0.9002 0.6457
y =
0.5771
dj =
-0.0083 0.0071 0.0361
b =
-0.0025 0.0021 -0.9892
The weight matrix for input
1.9985 1.0013 0.0065
0.9980 2.0017 2.0087
0 3.0000 1.0000
Bias matrix for input
-0.0025 0.0021 -0.9892
The weight and bias matrix for hidden layer
-0.9792 1.0213 2.0153
bias for output
-0.9764