A Simple Code Implementation for "Neural Network BP Algorithm Notes (1)"

While editing the previous article, save errors kept occurring, so the code gets its own post:

% Simple-version neural network, trained in a loop; the biases are vectors too
% 参考:https://blog.csdn.net/wsxzhbzl/article/details/83537662
Input = [0.05,0.1];
Target = [0.01,0.99];
eta = 0.5;
Weights1 =[0.15, 0.25;0.20, 0.30];% rand(2,2);
bias1 = [0.35,0.35];% rand(1,2);
Weights2 = [0.40,0.50;0.45,0.55];%rand(2,2);
bias2 =[0.60,0.60];%rand(1,2);
Sigmoid = @(x) 1./(1+exp(-x)); % logistic activation
iter=50000;
for i = 1:iter
%feedforward
net_h = Input*Weights1+bias1;
out_h = Sigmoid(net_h);

net_o = out_h*Weights2+bias2;
out_o = Sigmoid(net_o);

%backpropagation
% square error 
mse = 1/2*sum((out_o-Target).^2);
% weight2 update
%1  Weights2 update
mse2out_o = -(Target -out_o);
out_o2net_o =  (out_o.*(1-out_o)) ;
net_o2Weights2 = out_h;

delta2 =  (mse2out_o .* out_o2net_o)';   % 2x1 column of output-layer deltas
w2_delta = (delta2 * net_o2Weights2)';   % dE/dWeights2(h,o) = out_h(h)*delta2(o)

b2_delta = delta2';

Weights2_new = Weights2-eta*w2_delta;
bias2_new = bias2-eta*b2_delta;
%2: Weights1 update
mse2out_h = Weights2*delta2;             % dE/dout_h, 2x1 column
out_h2net_h = out_h.*(1-out_h);
net_h2Weights1 = Input;

delta1 = mse2out_h.*out_h2net_h';        % 2x1 column of hidden-layer deltas
w1_delta = (delta1*net_h2Weights1)';     % dE/dWeights1(i,j) = Input(i)*delta1(j)
b1_delta = delta1';                      % the bias gradient is just the delta

Weights1_new = Weights1 - eta*w1_delta;
bias1_new = bias1 - eta*b1_delta;

Weights1 = Weights1_new;
bias1 = bias1_new;
Weights2 = Weights2_new;
bias2 = bias2_new;
end
% str = sprintf('target:%d,out:%d,MSE:%d\n',Target,out_o,mse2out_o);
% disp(str);
disp('Target:');
Target
disp('Predict:');
out_o
disp('MSE:')
mse
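The MATLAB listing above maps directly onto plain Python. The sketch below (the `train` helper and its loop structure are my own, not from the original post) reproduces the same 2-2-2 network and gradient-descent updates, which is a convenient way to check that the squared error actually shrinks:

```python
import math

def sigmoid(x):
    # logistic activation, same as the MATLAB Sigmoid
    return 1.0 / (1.0 + math.exp(-x))

def train(inputs, target, W1, b1, W2, b2, eta=0.5, iters=10000):
    """2-2-2 network; W[i][j] connects unit i of one layer to unit j of
    the next, matching the column layout of the MATLAB weight matrices."""
    for _ in range(iters):
        # feedforward
        net_h = [sum(inputs[i] * W1[i][j] for i in range(2)) + b1[j] for j in range(2)]
        out_h = [sigmoid(v) for v in net_h]
        net_o = [sum(out_h[h] * W2[h][j] for h in range(2)) + b2[j] for j in range(2)]
        out_o = [sigmoid(v) for v in net_o]
        # output deltas: dE/dnet_o = (out_o - target) * out_o * (1 - out_o)
        d2 = [(out_o[j] - target[j]) * out_o[j] * (1 - out_o[j]) for j in range(2)]
        # hidden deltas: push d2 back through W2, then through the sigmoid
        d1 = [sum(W2[h][j] * d2[j] for j in range(2)) * out_h[h] * (1 - out_h[h])
              for h in range(2)]
        # gradient descent: dE/dW2[h][j] = out_h[h]*d2[j], dE/dW1[i][j] = inputs[i]*d1[j]
        for h in range(2):
            for j in range(2):
                W2[h][j] -= eta * out_h[h] * d2[j]
        for i in range(2):
            for j in range(2):
                W1[i][j] -= eta * inputs[i] * d1[j]
        for j in range(2):
            b2[j] -= eta * d2[j]
            b1[j] -= eta * d1[j]
    err = 0.5 * sum((out_o[j] - target[j]) ** 2 for j in range(2))
    return out_o, err

out_o, err = train([0.05, 0.1], [0.01, 0.99],
                   [[0.15, 0.25], [0.20, 0.30]], [0.35, 0.35],
                   [[0.40, 0.50], [0.45, 0.55]], [0.60, 0.60])
print(out_o, err)
```

With these starting values (the same as in the MATLAB script), the predictions should move toward [0.01, 0.99] and the squared error should fall far below its initial value of roughly 0.30.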

Things to improve: (1) update the biases b1 and b2 independently;
(2) implement a classification network on the MNIST data.

Reposted from blog.csdn.net/alansss/article/details/104344883