This forum is for discussing topics from different artificial intelligence areas, such as neural networks, genetic algorithms, machine learning, etc.
by cesarsouza » Sun Nov 01, 2009 5:54 pm
Hello,
I'm implementing the Levenberg-Marquardt algorithm for neural network learning. It partially works (!), but I'm using the numerical approach for the computation of the Jacobian matrix (using finite differences), which adds numerical errors and instability to the convergence. Does anyone know how to compute the Jacobian matrix using the backpropagation method?
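For reference, the finite-difference approach I'm currently using looks roughly like this (a Python sketch with a made-up one-neuron model, not the actual code):

```python
import numpy as np

def numerical_jacobian(f, w, eps=1e-6):
    # Central-difference Jacobian of the residual vector f(w) w.r.t. w.
    # Each entry carries O(eps^2) truncation error plus round-off noise,
    # which is exactly what destabilizes the convergence.
    r0 = f(w)
    J = np.zeros((r0.size, w.size))
    for j in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[j] += eps
        wm[j] -= eps
        J[:, j] = (f(wp) - f(wm)) / (2.0 * eps)
    return J

# Toy "network": y = tanh(w0 * x + w1), residuals against targets t
x = np.array([0.0, 0.5, 1.0])
t = np.array([0.1, 0.4, 0.7])
residuals = lambda w: np.tanh(w[0] * x + w[1]) - t
J = numerical_jacobian(residuals, np.array([0.3, -0.2]))
# J has one row per sample and one column per weight
```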
I've already read the references on Bishop and Haykin, but I'm still having some questions about the implementation.
Is anyone here familiar with this problem?
Edit:
Knowing this problem is very similar to backpropagation learning, I tried to adapt the BackPropagationLearning class to compute the row vectors of the Jacobian matrix. However, the AForge implementation uses Derivative2 to perform the calculation. Is this right? Wasn't it supposed to be the first derivative?
Thanks, Cesar

cesarsouza

 Posts: 63
 Joined: Fri Apr 10, 2009 3:41 pm
by andrew.kirillov » Mon Nov 02, 2009 11:20 am
Hello,
cesarsouza wrote:Knowing this problem is very similar to backpropagation learning, I tried to adapt the BackPropagationLearning class to compute the row vectors of the Jacobian matrix. However, the AForge implementation uses Derivative2 to perform the calculation. Is this right? Wasn't it supposed to be the first derivative?
What does the documentation say about this? Let's see ... So it is not actually a second derivative. Sorry for the slightly confusing name, but ... at least it was documented. I would call a method for calculating a second derivative something like SecondDerivative().
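To make the distinction concrete, here is a small sketch (in Python, using the sigmoid as an example, not the AForge code itself): Derivative computes f'(x) from the input, while Derivative2, per the point above, computes the same first derivative but from the already-known output value y = f(x):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def derivative(x):
    # First derivative computed from the *input* x
    y = sigmoid(x)
    return y * (1.0 - y)

def derivative2(y):
    # The same first derivative, but computed from the *output* y = sigmoid(x).
    # A backprop loop can call this with the stored activation instead of
    # re-evaluating the sigmoid; it is not a second derivative.
    return y * (1.0 - y)

assert abs(derivative(0.7) - derivative2(sigmoid(0.7))) < 1e-12
```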

andrew.kirillov
 Site Admin, AForge.NET Developer

 Posts: 3439
 Joined: Fri Jan 23, 2009 9:12 am
 Location: UK
by cesarsouza » Mon Nov 02, 2009 2:25 pm
Sorry about that, Andrew. I saw there were Derivative and Derivative2 and assumed the latter was the second derivative. Totally missed the documentation. Anyway, backpropagation calculates the partial derivative of the error function w.r.t. the weights. What I need to calculate (I think) is the partial derivative of the network function itself w.r.t. the weights. It can't be this difficult... I'm still trying!
Btw, http://crsouza.blogspot.com/2009/09/principalcomponentanalysisinc.html
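In other words, the trick seems to be: run the usual forward/backward passes, but seed the backward pass with 1 instead of the output error, so what comes out are the derivatives of the network output itself, i.e. one Jacobian row per sample. A hedged Python sketch (made-up layer sizes, not AForge code):

```python
import numpy as np

def jacobian_row(x, W1, b1, W2, b2):
    # One-hidden-layer tanh network with a single linear output neuron.
    # Forward pass:
    a1 = np.tanh(W1 @ x + b1)
    y = (W2 @ a1 + b2)[0]
    # Backward pass, seeded with dy/dy = 1 instead of the error:
    d2 = np.array([1.0])
    dW2, db2 = np.outer(d2, a1), d2
    d1 = (W2.T @ d2) * (1.0 - a1 ** 2)   # tanh'(z1) expressed via a1
    dW1, db1 = np.outer(d1, x), d1
    # Flattened row: d(output)/d(every weight and bias)
    return np.concatenate([dW1.ravel(), db1, dW2.ravel(), db2]), y

# Example with arbitrary sizes: 3 inputs, 4 hidden units
rng = np.random.default_rng(0)
x = rng.normal(size=3)
W1, b1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, b2 = rng.normal(size=(1, 4)), rng.normal(size=1)
row, y = jacobian_row(x, W1, b1, W2, b2)
```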

cesarsouza

 Posts: 63
 Joined: Fri Apr 10, 2009 3:41 pm
by andrew.kirillov » Mon Nov 02, 2009 4:38 pm
From your web site:
My initial idea was to implement PCA in the AForge.NET Framework. AForge.NET is an excellent Artificial Intelligence / Computer Vision framework written for .NET and developed mainly by Andrew Kirillov.
However, because it involved a lot of new additions to AForge, I think it was very difficult to review all the changes required to incorporate it directly into the source code of the project.
I remember we had an email chain about this around 1.5 years ago. However, at that time it was a bit of a hard period for the AForge.NET framework project, since I was busy with different stuff. As for now, PCA is on my TODO list. My primary interest in it is its application in the computer vision area. I don't mind discussing the way we could introduce PCA in AForge.NET (I would really like to have it). But it may take a while to do this. If we think from a framework perspective, then it may be really nice to extend the Math namespace with matrix-related classes (a matrix class and different computations with matrices). Then, if we go this road, we may need to think about classes which provide matrix operations using the ATLAS or Intel MKL libraries (to make things faster). So it will be a lot of work. Someday, I hope, it will be started. If you would like to join it, you are welcome.

andrew.kirillov
 Site Admin, AForge.NET Developer

 Posts: 3439
 Joined: Fri Jan 23, 2009 9:12 am
 Location: UK
by cesarsouza » Mon Nov 02, 2009 5:26 pm
Sure.
But first, I'd consider waiting for the next version (4) of the .NET Framework, which will include extensible parallelization support (which I've already seen in action inside some matrix classes), and I'm almost sure they'll also include their own Complex number data type. So that's good news for numerical/scientific frameworks.
As for the framework point of view, introducing matrix classes would certainly be a good thing. But there are a lot of details involved in how they would interoperate with the rest of the framework. I know AForge is sometimes more focused on Computer Vision, but the problems I deal with are usually related to simpler tabular data (clinical data). I don't know if PCA should be implemented as a general matrix tool, specifically for images, or for both.
Btw, I'm not using any special Matrix classes anymore, as I commented on my blog post, because of this.
Another thing I sometimes miss from the AForge Framework is a proper AForge.Statistics namespace, which could hold analyses such as PCA, ICA and LDA for general data, not only for images. I feel the current statistics approach, a static class inside AForge.Math, is a bit too simplistic, even if the two fields, Maths and Statistics, are intrinsically related.
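For what it's worth, the kind of general-data analysis I have in mind can be tiny. A minimal PCA over a tabular matrix, sketched in Python (not AForge code), is essentially:

```python
import numpy as np

def pca(X, n_components):
    # Minimal PCA for tabular data: center the columns, then SVD.
    mean = X.mean(axis=0)
    Xc = X - mean
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]            # principal axes (rows)
    scores = Xc @ components.T                # data projected on the axes
    variances = (S ** 2) / (len(X) - 1)       # variance per component
    return scores, components, variances[:n_components]
```

An image-specific version would just be this applied to flattened pixel vectors, which is why a generic implementation could serve both cases.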
Regards, Cesar

cesarsouza

 Posts: 63
 Joined: Fri Apr 10, 2009 3:41 pm
by andrew.kirillov » Wed Nov 04, 2009 9:38 am
Hello,
cesarsouza wrote:But first, I'd consider waiting for the next version (4) of the .NET Framework, which will include extensible parallelization support (which I've already seen in action inside some matrix classes), and I'm almost sure they'll also include their own Complex number data type. So that's good news for numerical/scientific frameworks.
Here we already go different roads. Thinking from the framework perspective, I would avoid switching to the fresh .NET 4. Even now the AForge.NET framework is still based on version 2.0 of .NET, not 3.5. There is nothing in 3.5 we could not live without. But switching to higher .NET versions requires the rest of the AForge.NET framework users to switch, which is not nice.
Regarding parallelizing computations: we also have a bit for it, Parallel.For. Yes, it is a very simplified (light) version of what MS introduced, but it covers 99% of use cases. And it does not seem to be slower (at least in my tests). I was writing about it on CP: Parallel Computations in C#.
The Complex class is not a showstopper either; we have it. Obviously MS will introduce something which integrates better into their framework. But ... they have the manpower to do lots of stuff.
cesarsouza wrote:I don't know if PCA should be implemented as a general matrix tool, specifically for images, or for both.
As for me, it would be nice to have it generic, so it could be used for regular data and for images.
cesarsouza wrote:Another thing I sometimes miss from the AForge Framework is a proper AForge.Statistics namespace, which could hold analyses such as PCA, ICA and LDA for general data, not only for images. I feel the current statistics approach, a static class inside AForge.Math, is a bit too simplistic, even if the two fields, Maths and Statistics, are intrinsically related.
True. I don't mind having an AForge.Math.Statistics namespace. The current static Statistics class is just some basic stuff, which should probably be renamed to free the name for the namespace.
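For reference, the pattern behind the Parallel.For mentioned above (sketched here in Python with a thread pool; not the actual AForge code) is just an independent loop body dispatched over worker threads:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def parallel_for(start, stop, body, workers=4):
    # Toy Parallel.For: run body(i) for each index on a thread pool.
    # Like the simplified version discussed above, it only covers the
    # common case of a loop whose iterations are independent.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        list(pool.map(body, range(start, stop)))

# Example: process each matrix row independently
A = np.arange(12.0).reshape(4, 3)
out = np.empty_like(A)

def double_row(i):
    out[i] = A[i] * 2.0

parallel_for(0, 4, double_row)
```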

andrew.kirillov
 Site Admin, AForge.NET Developer

 Posts: 3439
 Joined: Fri Jan 23, 2009 9:12 am
 Location: UK
by cesarsouza » Fri Nov 20, 2009 2:54 pm
Hi Andrew,
andrew.kirillov wrote:Hello, Thinking from the framework perspective, I would avoid switching to the fresh .NET 4. Even now the AForge.NET framework is still based on version 2.0 of .NET, not 3.5. There is nothing in 3.5 we could not live without. But switching to higher .NET versions requires the rest of the AForge.NET framework users to switch, which is not nice.
I guess you are right.
andrew.kirillov wrote:True. I don't mind having an AForge.Math.Statistics namespace. The current static Statistics class is just some basic stuff, which should probably be renamed to free the name for the namespace.
I was thinking: why not have a separate AForge.Statistics (or AForge.Math.Statistics) project, just like there are AForge.Neuro and AForge.Math? I think this would be interesting because not everyone who is interested in, let's say, Fourier transforms will want to add dependencies on the entire statistics part of the assembly.
By the way, getting back to the original topic, I've almost finished the implementation of the Levenberg-Marquardt algorithm with Bayesian regularization for neural networks. I think the code may still have some bugs, and there is certainly room for improvement, but I believe those will eventually be solved after some more testing.
Regards, Cesar
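The core of that implementation is the standard L-M update: solve (JᵀJ + λI)Δw = -Jᵀr and adapt λ between Gauss-Newton-like and gradient-descent-like behaviour. A Python sketch of just that step (the Bayesian re-estimation of the regularization term is omitted here):

```python
import numpy as np

def lm_step(J, r, lam):
    # One Levenberg-Marquardt update: solve (J^T J + lam*I) dw = -J^T r.
    # Small lam ~ Gauss-Newton; large lam ~ scaled gradient descent.
    JtJ = J.T @ J
    return np.linalg.solve(JtJ + lam * np.eye(JtJ.shape[0]), -J.T @ r)

# Trivial sanity check: fit y = a*x + b to exact data
x = np.array([0.0, 1.0, 2.0, 3.0])
t = 2.0 * x + 1.0
w = np.zeros(2)
for _ in range(50):
    r = w[0] * x + w[1] - t                     # residuals
    J = np.column_stack([x, np.ones_like(x)])   # dr/dw
    w = w + lm_step(J, r, 1e-3)
# w converges to (2, 1)
```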

cesarsouza

 Posts: 63
 Joined: Fri Apr 10, 2009 3:41 pm