matlab - How to implement a neural network with a hidden layer?


I am trying to train a 3-input, 1-output neural network (with an input layer, 1 hidden layer, and an output layer) that can classify quadratics in MATLAB. I am attempting to implement the phases: feed-forward, $x_i^{out}=f(s_i)$ with $s_i=\sum_j w_{ij}x_j^{in}$; back-propagation, $\delta_j^{in}=f'(s_j)\sum_i \delta_i^{out}w_{ij}$; and the weight update $w_{ij}^{new}=w_{ij}^{old}-\epsilon\,\delta_i^{out}x_j^{in}$, where $x$ is the input vector, $w$ is a weight, and $\epsilon$ is the learning rate.
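The three phases above can be sketched directly from the formulas. Below is a minimal pure-Python illustration (the question's code is MATLAB; the function names `forward` and `backprop_step` are my own, and the output unit uses the same $f=\tanh$ as the hidden layer):

```python
import math

def forward(x, W1, W2):
    # Hidden pre-activations s1_i = sum_j W1[i][j] * x[j], hidden outputs f(s1_i) = tanh(s1_i)
    s1 = [sum(W1[i][j] * x[j] for j in range(len(x))) for i in range(len(W1))]
    h = [math.tanh(s) for s in s1]
    # Single output pre-activation s2 = sum_i W2[i] * h[i]
    s2 = sum(W2[i] * h[i] for i in range(len(h)))
    return s1, h, s2

def backprop_step(x, t, W1, W2, eps):
    s1, h, s2 = forward(x, W1, W2)
    # Output delta for a squared-error loss with f = tanh at the output
    delta_out = math.tanh(s2) - t
    # Hidden deltas: delta_j = f'(s1_j) * W2[j] * delta_out, with f'(s) = 1 - tanh(s)^2
    delta_hidden = [(1 - math.tanh(s1[j]) ** 2) * W2[j] * delta_out
                    for j in range(len(s1))]
    # Gradient-descent updates: w_ij <- w_ij - eps * delta_i * x_j
    W2 = [W2[j] - eps * delta_out * h[j] for j in range(len(W2))]
    W1 = [[W1[i][j] - eps * delta_hidden[i] * x[j] for j in range(len(x))]
          for i in range(len(W1))]
    return W1, W2
```

Repeatedly calling `backprop_step` on a training example should drive the output error down; if it does not, one of the three phases is wired up wrong.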

I am having trouble coding the hidden layer and adding the activation function $f(s)=\tanh(s)$, since the error in the output of the network doesn't seem to decrease. Can you point out what I am implementing wrong?
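One quick sanity check when the error refuses to decrease is to verify the derivative used in back-propagation, $f'(s)=1-\tanh^2(s)$, against a finite difference. A small Python check (illustrative only):

```python
import math

def f(s):
    return math.tanh(s)

def fprime(s):
    # Analytic derivative plugged into the back-propagation formula
    return 1 - math.tanh(s) ** 2

# Compare with a central finite difference at a few points
h = 1e-6
for s in (-2.0, -0.5, 0.0, 1.3):
    numeric = (f(s + h) - f(s - h)) / (2 * h)
    assert abs(numeric - fprime(s)) < 1e-8
```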

The inputs are the real coefficients of a quadratic $ax^2 + bx + c = 0$, and the output should be positive if the quadratic has 2 real roots and negative if it doesn't.
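The teacher signal is just the sign of the discriminant. A tiny Python illustration (the function name `two_real_roots` is mine; note that MATLAB's `sign()` returns 0 for a zero discriminant, a degenerate case folded into $-1$ here):

```python
def two_real_roots(a, b, c):
    # +1 if the discriminant b^2 - 4ac is positive (two distinct real roots),
    # -1 otherwise
    return 1 if b * b - 4 * a * c > 0 else -1

# x^2 - 3x + 2 = (x - 1)(x - 2) has two real roots
assert two_real_roots(1, -3, 2) == 1
# x^2 + 1 = 0 has no real roots
assert two_real_roots(1, 0, 1) == -1
```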

ntrain = 100; % size of training set
noutput = 1;
nsecondlayer = 7; % size of hidden layer (arbitrary)
trainexamples = rand(4,ntrain); % independent random set of examples
trainexamples(4,:) = ones(1,ntrain); % set dummy input to 1

t = sign(trainexamples(2,:).^2-4*trainexamples(1,:).*trainexamples(3,:)); % teacher provides every example

% the student neuron starts with random weights
w1 = rand(4,nsecondlayer);
w2 = rand(nsecondlayer,noutput);
nepochs = 0;
nwrong = 1;
s1(nsecondlayer,ntrain) = 0;
s2(noutput,ntrain) = 0;

while( nwrong > 1e-2 ) % or any small number close to 0
    for i = 1:ntrain
        x = trainexamples(:,i);
        s2(:,i) = w2'*s1(:,i);
        deltak = tanh(s2(:,i)) - t(:,i); % back-propagate
        deltaj = (1-tanh(s2(:,i)).^2).*(w2*deltak); % back-propagate
        w2 = w2 - tanh(s1(:,i))*deltak'; % update
        w1 = w1 - x*deltaj'; % update
    end
    output = tanh(w2'*tanh(w1'*trainexamples));
    doutput = output - t;
    nwrong = sum(abs(doutput));
    disp(nwrong)
    nepochs = nepochs + 1
end
nepochs

thanks

After a few days of bashing my head against the wall I discovered a small typo. Below is a working solution:

clear
% set parameters
ninput = 4; % number of nodes in the input layer
noutput = 1; % number of nodes in the output layer
nhiddenlayer = 7; % number of nodes in the hidden layer
ntrain = 1000; % size of training set
epsilon = 0.01; % learning rate

% set inputs: random coefficients between -1 and 1
trainexamples = 2*rand(ninput,ntrain)-1;
trainexamples(ninput,:) = ones(1,ntrain); % set last input to 1

% set student neurons for both the hidden and output layers
s1(nhiddenlayer,ntrain) = 0;
s2(noutput,ntrain) = 0;

% the student neuron starts with random weights for both the input and hidden layers
w1 = rand(ninput,nhiddenlayer);
w2 = rand(nhiddenlayer+1,noutput);

% calculate teacher outputs according to the quadratic formula
t = sign(trainexamples(2,:).^2-4*trainexamples(1,:).*trainexamples(3,:));

% initialise values for looping
nepochs = 0;
nwrong = ntrain*0.01;
wrong = [];
epoch = [];

while(nwrong >= (ntrain*0.01)) % as long as more than 1% of the outputs are wrong
    for i = 1:ntrain
        x = trainexamples(:,i);
        s1(1:nhiddenlayer,i) = w1'*x;
        s2(:,i) = w2'*[tanh(s1(:,i));1];
        delta1 = tanh(s2(:,i)) - t(:,i); % back-propagate
        delta2 = (1-tanh(s1(:,i)).^2).*(w2(1:nhiddenlayer,:)*delta1); % back-propagate
        w1 = w1 - epsilon*x*delta2'; % update
        w2 = w2 - epsilon*[tanh(s1(:,i));1]*delta1'; % update
    end

    outputnn = sign(tanh(s2));
    delta = outputnn - t; % difference between student and teacher
    nwrong = sum(abs(delta/2));
    nepochs = nepochs + 1;
    wrong = [wrong nwrong];
    epoch = [epoch nepochs];
end
plot(epoch,wrong);
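One visible difference from the first attempt is that here the hidden layer's output is augmented with a constant 1 before it reaches the output weights (`w2` has `nhiddenlayer+1` rows), so the extra output weight acts as a bias term. In Python terms, the augmentation step looks like this (helper name is mine):

```python
import math

def hidden_with_bias(s1):
    # tanh of the hidden pre-activations, with a constant 1 appended
    # so that the output layer's extra weight behaves as a bias
    return [math.tanh(s) for s in s1] + [1.0]
```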
