Neural Networks for Prediction - error flatlined

Postby stino » Tue May 21, 2013 7:03 pm


I have a data set of chemical elements; for each element I have values such as pH and some properties of the molecule. For each element I also have a value "K" that was determined in an experiment.

I would like to create a neural network that takes the input properties and tries to predict the K value.
To do so, I took inspiration from this blog post: ... evelopment

The example on cancer prediction looks fairly similar to what I'm trying to do, so I used an ActivationNetwork with BackPropagationLearning.
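Roughly, my setup looks like this (a simplified sketch; the layer sizes, learning parameters, and the `inputs`/`outputs` arrays are placeholders, not my exact code):

```csharp
using System;
using AForge.Neuro;
using AForge.Neuro.Learning;

// e.g. 5 input properties (pH plus 4 molecular descriptors), 1 output (K)
ActivationNetwork network = new ActivationNetwork(
    new SigmoidFunction( 2.0 ), // sigmoid activation, alpha = 2
    5,                          // inputs
    10,                         // hidden neurons
    1 );                        // output: predicted K

BackPropagationLearning teacher = new BackPropagationLearning( network );
teacher.LearningRate = 0.1;
teacher.Momentum     = 0.0;

// inputs  : double[][] of property vectors
// outputs : double[][] of the corresponding K values
for ( int epoch = 0; epoch < 1000; epoch++ )
{
    double error = teacher.RunEpoch( inputs, outputs );
    Console.WriteLine( error ); // stays at essentially the same value every epoch
}
```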

But no matter how many epochs I run, the error remains constant at the same value, regardless of what learningRate or momentum I specify, or even how many nodes per layer I use.
It's like the network had a cerebral arrest: completely flat-lined.

For any configuration of neuron layers, it outputs the same error (give or take 0.00005).

So does that suggest my data absolutely can't be learned from?

I'd have guessed there is always some order in chaos, so it should learn something: maybe not what I expect, but something. Shouldn't it?
Posts: 1
Joined: Tue May 21, 2013 6:45 pm
