
Implementation of backpropagation


Post by denton » Mon May 12, 2014 12:28 pm

Hello,

I am sorry for my English. I have written my own neural network library for my bachelor thesis. In one chapter I have to compare my library with other libraries. The problem is that the implementation of backpropagation in AForge.NET is different. Which variation of backpropagation does AForge.NET implement? I don't understand the weight update of a neuron in each step:

neuronWeightUpdates[j] = learningRate * (
momentum * neuronWeightUpdates[j] +
( 1.0 - momentum ) * error * input[j]
);

All the sources I know refer to a weight update like this:
neuronWeightUpdates[j] = learningRate * error * input[j] + momentum * neuronWeightUpdates[j];

Encog's implementation of backpropagation also uses this weight update. In my comparison chapter I have to describe why my library does not give the same results as AForge.NET. It is strange to say that both libraries (mine and AForge.NET) implement backpropagation and yet do not give the same results. So I would like to know which variation of backpropagation AForge.NET uses, or which resource describes this version of backpropagation.
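
To make the difference concrete, here is a small standalone C# sketch that applies both formulas to a single weight. The values of learningRate, momentum, error, input and prevUpdate are made-up examples, not taken from either library; only the two update formulas quoted above are real:

using System;

class MomentumComparison
{
    static void Main()
    {
        // Hypothetical example values, just to compare the two formulas.
        double learningRate = 0.1;
        double momentum     = 0.9;
        double error        = 0.5;   // neuron's error term (delta)
        double input        = 1.0;   // input feeding the weight
        double prevUpdate   = 0.02;  // weight update from the previous iteration

        // AForge.NET-style update: momentum acts as a smoothing
        // (exponential moving average) of the gradient term.
        double aforgeUpdate = learningRate * (
            momentum * prevUpdate +
            (1.0 - momentum) * error * input);

        // Classical momentum update (the textbook / Encog form):
        // the previous update is added on top of the full gradient step.
        double classicUpdate = learningRate * error * input +
                               momentum * prevUpdate;

        Console.WriteLine("AForge-style update: " + aforgeUpdate);
        Console.WriteLine("Classical update:    " + classicUpdate);
    }
}

Roughly, the AForge-style rule scales the gradient term by (1 - momentum), so with the same learningRate its steps are smaller, which already makes the two rules diverge even on identical networks and data.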