AForge.NET


Backpropagation NN: output returns the same (C#)

The forum is to discuss topics from different artificial intelligence areas, like neural networks, genetic algorithms, machine learning, etc.


Postby nguyendo891 » Mon Jul 04, 2016 2:38 pm

Hi,
I have read some post about this problem. My case is I have 2 different images (black and white color) as inputs , and they are binarized to {1,0} like:
double[][] input = new double[2][]
{
    new double[9600] { 0, 0, 0, 1, ... },
    new double[9600] { 0, 1, 0, 0, 0, ... }
};
The size of the input is 9600.
and the output is:
double[][] output = new double[2][]
{
    new double[1] { 0 },
    new double[1] { 1 }
};
My network setup: learning rate = 0.01, momentum = 0.01, iterations = 10000;

m_NeuralNetwork = new ActivationNetwork(new BipolarSigmoidFunction(2), 9600, 2, 1);

Then I call RunEpoch() repeatedly; after 10000 iterations the error is approximately 2.5. Then I test with:

var result1 = m_NeuralNetwork.Compute(new double[9600] { 0, 0, 0, 1, ... }); // for the first image, it returns 0.5
var result2 = m_NeuralNetwork.Compute(new double[9600] { 0, 1, 0, 0, 0, ... }); // for the second image, it also returns exactly 0.5

I have tried adjusting the alpha of the sigmoid function to 0.01 or 0.1, but the two result values are still the same.
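For completeness, my training loop looks roughly like this (a sketch; BackPropagationLearning is the AForge.NET backpropagation teacher, and the input/output arrays are those shown above):

```csharp
// Sketch of the training setup, matching the parameters above.
// Requires: using AForge.Neuro; using AForge.Neuro.Learning;
var m_NeuralNetwork = new ActivationNetwork(
    new BipolarSigmoidFunction(2), // alpha = 2; note the output range is (-1, 1)
    9600, // one input per pixel
    2,    // hidden neurons
    1);   // one output neuron

var teacher = new BackPropagationLearning(m_NeuralNetwork)
{
    LearningRate = 0.01,
    Momentum = 0.01
};

double error = 0;
for (int i = 0; i < 10000; i++)
    error = teacher.RunEpoch(input, output); // error is about 2.5 after 10000 epochs
```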
Do you have any ideas ?
Thanks for reading.
nguyendo891
 
Posts: 7
Joined: Mon Jul 04, 2016 2:09 pm

Re: Backpropagation NN: output returns the same (C#)

Postby Lawrence » Sat Jul 09, 2016 9:39 am

Hi,

Have you tried a smaller learning rate? 0.01 is quite high. Perhaps 0.00001?

With a high learning rate, even with 10K iterations you are likely to end up at the same solution. That might be part of your problem.
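If it helps, a quick sweep over rates will show the effect (a sketch; it reuses your network parameters and input/output arrays, and rebuilds the network for each rate so every run starts fresh):

```csharp
// Compare the final epoch error for several learning rates (sketch).
foreach (double rate in new[] { 0.01, 0.001, 0.0001, 0.00001 })
{
    var net = new ActivationNetwork(new BipolarSigmoidFunction(2), 9600, 2, 1);
    var teacher = new BackPropagationLearning(net)
    {
        LearningRate = rate,
        Momentum = 0.01
    };

    double error = 0;
    for (int i = 0; i < 10000; i++)
        error = teacher.RunEpoch(input, output);

    Console.WriteLine("rate {0}: final error {1}", rate, error);
}
```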
Lawrence
 
Posts: 24
Joined: Tue May 17, 2016 11:32 am

Re: Backpropagation NN: output returns the same (C#)

Postby nguyendo891 » Mon Jul 11, 2016 12:23 pm

Hi Lawrence,
I have tried scaling the input and output values from 1 down to 0.01, like:

double[][] input = new double[2][]
{
    new double[9600] { 0, 0, 0, 0.01, ... }, // 0.01 instead of 1
    new double[9600] { 0, 0.01, 0, 0, 0, ... }
};

The size of the input is 9600.
and the output is:

double[][] output = new double[2][]
{
    new double[1] { 0 },
    new double[1] { 0.01 } // 0.01 instead of 1
};

and reduced the learning rate from 0.01 to 0.0001, and it works.
But when I train with a thousand inputs, it doesn't work anymore:

double[][] input = new double[1000][] // instead of 2
{
    new double[9600] { 0, 0, 0, 0.01, ... }, // 0.01 instead of 1
    new double[9600] { 0, 0.01, 0, 0, 0, ... },
    ...
};

How should I adjust these values (learning rate, momentum, input, output, ...) to get an acceptable result when the number of inputs is increased?
nguyendo891
 
Posts: 7
Joined: Mon Jul 04, 2016 2:09 pm

Re: Backpropagation NN: output returns the same (C#)

Postby Lawrence » Mon Jul 11, 2016 1:25 pm

Hi,

I am not sure you have the array structure correct.

Assume you have 1000 samples of 2 inputs in 2 separate arrays, A and B.

Then:

double[][] input = new double[1000][];

for (int i = 0; i < 1000; i++)
{
    input[i] = new double[2];
    input[i][0] = A[i];
    input[i][1] = B[i];
}

Does your structure look like this?
Lawrence
 
Posts: 24
Joined: Tue May 17, 2016 11:32 am

Re: Backpropagation NN: output returns the same (C#)

Postby nguyendo891 » Tue Jul 12, 2016 1:59 pm

Hi Lawrence,
I think your description is totally different from mine. In my case, "2" means the number of samples.
I mean instead of double[][] input = new double[2][];
I now extend it to double[][] input = new double[1000][];
and the structure is :
double[][] input = new double[1000][]
{
    new double[9600] { 0, 0, 0, 0.01, ... },
    new double[9600] { 0, 0.01, 0, ... },
    new double[9600] { 0, 0, 0, ... },
    new double[9600] { 0, 0.01, 0, 0.01, ... },
    ...
    new double[9600] { 0.01, 0, 0.01, ... },
}; // 1000 entries of new double[9600] { ... }

double[][] output = new double[1000][]
{
    new double[1] { 0 },
    new double[1] { 0.01 },
    new double[1] { 0.01 },
    new double[1] { 0.01 },
    new double[1] { 0 },
    new double[1] { 0 },
    ...
}; // 1000 entries of new double[1] { ... }

And I don't need to separate the input into arrays A and B, because during training each new double[9600] { ... } is mapped to its corresponding output array. Then learning.RunEpoch(input, output) runs over the input/output pairs one by one.
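To be concrete, I build the two jagged arrays roughly like this (a sketch; LoadBinarizedPixels, imagePaths, and labels are hypothetical placeholders for my actual image-loading code):

```csharp
// Sketch: one input row per image, one output value per image.
// LoadBinarizedPixels (hypothetical) returns 9600 doubles in {0, 0.01}.
double[][] input = new double[1000][];
double[][] output = new double[1000][];

for (int i = 0; i < 1000; i++)
{
    input[i] = LoadBinarizedPixels(imagePaths[i]);
    output[i] = new double[1] { labels[i] }; // 0 or 0.01
}

// Then one call trains over all 1000 pairs:
// double error = learning.RunEpoch(input, output);
```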
nguyendo891
 
Posts: 7
Joined: Mon Jul 04, 2016 2:09 pm

Re: Backpropagation NN: output returns the same (C#)

Postby Lawrence » Wed Jul 13, 2016 10:01 am

OK,

So you have 1000 samples of 9600 inputs?

That's a lot of inputs! I have millions of samples BUT a maximum of only 5 inputs - so my network is very different from yours...
Lawrence
 
Posts: 24
Joined: Tue May 17, 2016 11:32 am

Re: Backpropagation NN: output returns the same (C#)

Postby nguyendo891 » Wed Jul 13, 2016 11:48 am

Yes, I want to train on 1000 images of 96*100 pixels, where each pixel is 1 or 0 (or 0.01 or 0).
nguyendo891
 
Posts: 7
Joined: Mon Jul 04, 2016 2:09 pm



