AForge.NET


using backpropagation for single layer perceptron

The forum is to discuss topics from different artificial intelligence areas, like neural networks, genetic algorithms, machine learning, etc.


Postby rads » Tue Jun 14, 2011 5:05 am

As the title says, can I use the backpropagation learning library for a single-layer perceptron?

There are only 6 inputs and 26 outputs; that's all, no hidden layer.
rads
 
Posts: 10
Joined: Sat Apr 23, 2011 9:45 am

Re: using backpropagation for single layer perceptron

Postby andrew.kirillov » Tue Jun 14, 2011 9:04 am

You can try it, actually ...

In general it should work. However, it is a bit of an overkill. For a single-layer network, weight updates are calculated simply by the PerceptronLearning class (AForge.NET Framework), which is a simple algorithm described here.
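The update that a perceptron learning rule performs is the classic delta rule. A minimal Python illustration follows; this is a sketch, not AForge.NET code. The 6-input/26-output shape comes from this thread, while the learning rate and the step activation are assumed example choices:

```python
# Sketch of the perceptron delta rule for a single-layer network.
# Shapes follow the thread: 6 inputs, 26 outputs. learning_rate is an
# assumed example value, not an AForge.NET default.
import random

n_inputs, n_outputs = 6, 26
learning_rate = 0.1

# One weight vector and one threshold (bias) per output neuron.
weights = [[random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
           for _ in range(n_outputs)]
thresholds = [0.0] * n_outputs

def compute(x):
    # Step activation: 1 if weighted sum + threshold >= 0, else 0.
    return [1.0 if sum(w * xi for w, xi in zip(ws, x)) + t >= 0 else 0.0
            for ws, t in zip(weights, thresholds)]

def train_sample(x, desired):
    """One delta-rule update: w += rate * error * input, t += rate * error."""
    output = compute(x)
    for j in range(n_outputs):
        error = desired[j] - output[j]
        for i in range(n_inputs):
            weights[j][i] += learning_rate * error * x[i]
        thresholds[j] += learning_rate * error
```

In AForge.NET the equivalent work is done by calling the learning class's Run method over the training samples, so none of this bookkeeping is written by hand.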
With best regards,
Andrew


Interested in supporting AForge.NET Framework?
andrew.kirillov
Site Admin, AForge.NET Developer
 
Posts: 3451
Joined: Fri Jan 23, 2009 9:12 am
Location: UK

Re: using backpropagation for single layer perceptron

Postby rads » Tue Jun 14, 2011 11:37 am

Sorry, one more question:

does AForge.NET have a bias node in its ANN?

Actually, I tried to compare the result from AForge.NET with a manual calculation in Excel, and it's different, so maybe AForge.NET has a bias node.

Also, how can I access the initial weights after they have been randomized?

Re: using backpropagation for single layer perceptron

Postby andrew.kirillov » Tue Jun 14, 2011 12:15 pm

rads wrote: does AForge.NET have a bias node in its ANN?

The question is actually answered in the documentation. There is a Threshold value, which is added to the weighted sum of inputs. See Compute() for details.
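For a manual check, each output neuron's value is the weighted sum of its inputs plus its Threshold, passed through the activation function. A small Python sketch of that computation (not AForge.NET code; alpha = 2.0 is an assumed steepness value for the bipolar sigmoid, so verify it against your network's BipolarSigmoidFunction settings):

```python
import math

def bipolar_sigmoid(x, alpha=2.0):
    # f(x) = 2 / (1 + e^(-alpha * x)) - 1, output range (-1, 1).
    # alpha = 2.0 is an assumed steepness value.
    return 2.0 / (1.0 + math.exp(-alpha * x)) - 1.0

def neuron_output(inputs, weights, threshold, alpha=2.0):
    # Weighted sum of inputs, plus the neuron's Threshold value,
    # passed through the activation function.
    s = sum(w * x for w, x in zip(weights, inputs)) + threshold
    return bipolar_sigmoid(s, alpha)
```

To reproduce a network's numbers in Excel, the same randomized weights and Threshold values must first be read out of the network itself, otherwise the two calculations will never match.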
With best regards,
Andrew



Re: using backpropagation for single layer perceptron

Postby rads » Tue Jun 14, 2011 5:31 pm

Thanks, that helps.
So, to review:

Step 1.
Take the sum of weight * input.

Step 2.
Add the threshold value.

Step 3.
Put it through the activation function, which is the bipolar sigmoid.

Step 4.
Where can I get this value (the raw error, not the squared error output)?
error = desired output - actual network output


Thanks a lot for your help.

Re: using backpropagation for single layer perceptron

Postby rads » Tue Jun 14, 2011 5:37 pm

I'm doing this to access the error.

First, get the output for the given input:
outputLooping = network.Compute(input[i]);


Second, subtract the computed result from the desired output:
outputLooping[j] = (valueOuput2[i][j] - outputLooping[j]);


But when I compare it to my manual calculation, it is still different.

Is my method wrong?
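The two steps above can be reproduced end-to-end in a self-contained Python sketch. The compute function here is a stand-in for network.Compute(), with an assumed bipolar sigmoid (alpha = 2.0); the real network's randomized weights are what determine the exact numbers:

```python
import math

def bipolar_sigmoid(x, alpha=2.0):
    # Assumed activation; alpha = 2.0 is an example steepness value.
    return 2.0 / (1.0 + math.exp(-alpha * x)) - 1.0

def compute(x, weights, thresholds):
    # Stand-in for network.Compute(): one value per output neuron,
    # each the weighted sum of inputs plus that neuron's threshold.
    return [bipolar_sigmoid(sum(w * xi for w, xi in zip(ws, x)) + t)
            for ws, t in zip(weights, thresholds)]

def raw_errors(x, desired, weights, thresholds):
    # Step 1: forward pass; step 2: desired output - actual output.
    actual = compute(x, weights, thresholds)
    return [d - a for d, a in zip(desired, actual)]
```

If the numbers still differ from a spreadsheet, the usual causes are different weights, different Threshold values, or a different sigmoid steepness on the two sides of the comparison.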



