As the title says, can I use a backpropagation learning library for a single-layer perceptron?
There are only 6 inputs and 26 outputs; that's all, no hidden layer.




using backpropagation for single layer perceptron
6 posts
• Page 1 of 1
Re: using backpropagation for single layer perceptron
You can try it, actually...
In general it should work, but it is a bit of overkill. For a single-layer network, weight updates are calculated simply by the PerceptronLearning class (AForge.NET), which implements the basic perceptron learning rule.
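The perceptron learning rule mentioned above can be sketched as follows. This is an illustrative Python sketch, not AForge.NET code (AForge.NET is C#); the learning rate, the hard-threshold activation, and the 6-input shape are assumptions taken from the thread.

```python
# Sketch of the classic perceptron learning rule (illustrative Python only;
# AForge.NET's PerceptronLearning class is C# -- this just mirrors the math).
# Assumed network shape from the thread: 6 inputs per output neuron.

def perceptron_update(weights, bias, x, desired, lr=0.1):
    """One update step for a single output neuron with threshold activation."""
    # Forward pass: weighted sum of inputs plus bias, then a hard threshold.
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    actual = 1.0 if s >= 0 else -1.0
    # Error is the desired output minus the actual output.
    err = desired - actual
    # Delta rule: nudge each weight in proportion to error and input.
    new_weights = [w + lr * err * xi for w, xi in zip(weights, x)]
    new_bias = bias + lr * err
    return new_weights, new_bias, err
```

When the neuron already classifies the sample correctly the error is zero and nothing changes; otherwise each weight moves toward reducing the error.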
Re: using backpropagation for single layer perceptron
Sorry, another question: does AForge.NET use a bias node in the ANN? I'm trying to compare AForge.NET's results with a manual calculation in Excel, and they are different, so maybe AForge has a bias node. Also, how can I access the initial randomized weights?
Re: using backpropagation for single layer perceptron
Re: using backpropagation for single layer perceptron
Thanks, that helps. So, to review:
Step 1: take the sum of weight * input.
Step 2: add the threshold value.
Step 3: put it through the activation function, which is the bipolar sigmoid.
Step 4: where can I get the error? Not the squared error, but the raw error = desired output - actual network output.
Thanks a lot for your help.
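The four steps above can be sketched as a forward pass for one output neuron. This is an illustrative Python sketch, not AForge.NET code; the alpha value of 2 (which I believe is AForge's BipolarSigmoidFunction default) and the exact bias handling are assumptions.

```python
import math

# Illustrative Python sketch of the four steps listed above.
# (AForge.NET is C#; alpha=2.0 is an assumed default.)

def bipolar_sigmoid(x, alpha=2.0):
    # Bipolar sigmoid: f(x) = 2 / (1 + e^(-alpha * x)) - 1, output in (-1, 1).
    return 2.0 / (1.0 + math.exp(-alpha * x)) - 1.0

def neuron_output(weights, threshold, inputs, alpha=2.0):
    # Step 1: weighted sum of inputs.
    s = sum(w * xi for w, xi in zip(weights, inputs))
    # Step 2: add the threshold (bias) value.
    s += threshold
    # Step 3: pass through the bipolar sigmoid activation.
    return bipolar_sigmoid(s, alpha)

# Step 4: raw (unsquared) error for one output neuron is simply
#   error = desired_output - actual_output
```

Note that the bipolar sigmoid is zero at the origin, so with zero weights and zero threshold the neuron outputs exactly 0.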
Re: using backpropagation for single layer perceptron
I'm doing this to access the error.
First, get the output for the input:
outputLooping = network.Compute(input[i]);
Second, subtract the computed result from the desired output:
outputLooping[j] = (valueOuput2[i][j] - outputLooping[j]);
But when I compare this to my manual calculation, it is still different. Is my method wrong, or not?
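The subtraction above can be sketched for a whole output layer as follows. This is an illustrative Python sketch under assumptions: the function names are hypothetical stand-ins, not AForge.NET API, and the squared-error variant is included only because libraries often report that instead of the raw per-neuron error, which is a common source of mismatch with a hand calculation.

```python
# Illustrative Python sketch of the error computation described above.
# The function names here are hypothetical; they are not AForge.NET API.

def output_errors(desired, actual):
    """Per-neuron raw errors: desired output minus actual network output."""
    return [d - a for d, a in zip(desired, actual)]

def sum_squared_error(desired, actual):
    # Many training loops report the summed squared error rather than the
    # raw per-neuron errors -- comparing one against the other will always
    # look "different" even when the network outputs match.
    return sum((d - a) ** 2 for d, a in zip(desired, actual))
```

Other things worth checking when a hand calculation disagrees: whether a bias/threshold term is included in the weighted sum, and whether the activation's alpha parameter matches.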


