Overfitting Considerations:
If left unchecked, backpropagation in multi-layer networks can be highly susceptible to overfitting to the training examples. The following graph plots the error on the training and test sets as the number of weight updates increases, and is typical of networks left to train unchecked.
Alarmingly, even though the error on the training set continues to gradually decrease, the error on the test set begins to increase towards the end. This is clearly overfitting: the network has begun to find and fine-tune to idiosyncrasies in the data rather than to its general properties. Given this phenomenon, it would be unwise to use some kind of threshold error as the termination condition for backpropagation.
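This fine-tuning to idiosyncrasies can be reproduced with a small sketch. The code below is a hypothetical illustration, not part of the original notes: the synthetic sine data, the noise level, and the polynomial degrees are all assumptions. A high-capacity polynomial drives the training error to essentially zero by fitting the noise in the sample, yet its error on held-out points stays far above its training error, while the low-capacity fit shows no such gap.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration (assumed data): training points are a noisy sine
# wave; test points are drawn from the clean underlying curve.
x_train = np.linspace(-1.0, 1.0, 12)
y_train = np.sin(3 * x_train) + rng.normal(0.0, 0.2, size=12)
x_test = np.linspace(-0.95, 0.95, 50)
y_test = np.sin(3 * x_test)

errs = {}
for degree in (2, 11):
    # Least-squares polynomial fit of the given degree.
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    errs[degree] = (train_err, test_err)
    print(f"degree {degree:2d}: train MSE {train_err:.4f}, test MSE {test_err:.4f}")
```

The degree-11 polynomial has as many parameters as there are training points, so it interpolates them exactly, noise and all; its near-zero training error says nothing about how it behaves between and beyond those points.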
In cases where the number of training examples is high, one antidote to overfitting is to split the training examples into a set used to train the weights and a set held back as an internal validation set. This mini-test set can be utilised to keep the network in check: if the error on the validation set reaches a minimum and then begins to increase, it could be that overfitting is beginning to occur, and training should be halted with the weights from the best epoch retained.
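The validation-set check described above is commonly implemented as early stopping. The sketch below is a minimal, hypothetical helper, not something given in the notes: the class name, the `patience` parameter, and the example error curve are all assumptions. It records the epoch with the lowest validation error and signals a halt once the error has failed to improve for `patience` consecutive checks.

```python
class EarlyStopper:
    """Track validation error after each epoch and signal when to stop."""

    def __init__(self, patience=3):
        self.patience = patience          # non-improving checks tolerated
        self.best_error = float("inf")    # lowest validation error so far
        self.best_epoch = None            # epoch whose weights to keep
        self.checks_since_best = 0

    def update(self, epoch, val_error):
        """Record one validation measurement; return True if training should stop."""
        if val_error < self.best_error:
            self.best_error = val_error
            self.best_epoch = epoch
            self.checks_since_best = 0
        else:
            self.checks_since_best += 1
        return self.checks_since_best >= self.patience


# Illustrative U-shaped validation curve (assumed values): error falls, then
# rises as the network starts fitting idiosyncrasies of the training data.
val_errors = [0.90, 0.55, 0.40, 0.35, 0.33, 0.36, 0.41, 0.48, 0.57]

stopper = EarlyStopper(patience=3)
for epoch, err in enumerate(val_errors):
    if stopper.update(epoch, err):
        break

print(stopper.best_epoch, stopper.best_error)  # -> 4 0.33
```

In a real training loop the caller would also snapshot the network weights whenever `best_epoch` advances, so that the model from the validation minimum, rather than the overfitted final one, is returned.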