Multi-Layer Network Architectures - Artificial intelligence:
Perceptrons are restricted in the class of concepts they can learn: a single perceptron can learn only linearly separable functions. However, we can construct larger networks by combining perceptrons, and in such multi-layer networks the step-function units are called perceptron units.
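As an illustration of this idea, the sketch below builds a small two-layer network of step-function (perceptron) units that computes XOR, a function that is not linearly separable and so cannot be learned by any single perceptron. The weights here are chosen by hand for illustration, not learned:

```python
def step(x):
    # Perceptron threshold unit: outputs 1 when the weighted input exceeds 0.
    return 1 if x > 0 else 0

def perceptron(inputs, weights, bias):
    # A single perceptron: weighted sum of inputs, then the step function.
    return step(sum(w * i for w, i in zip(weights, inputs)) + bias)

def xor_network(x1, x2):
    # Hidden layer of two perceptron units (hand-chosen weights):
    # one computes OR, the other computes NAND.
    h_or = perceptron([x1, x2], [1, 1], -0.5)
    h_nand = perceptron([x1, x2], [-1, -1], 1.5)
    # Output unit computes AND of the hidden units, giving XOR overall.
    return perceptron([h_or, h_nand], [1, 1], -1.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_network(a, b))
```

The hidden layer transforms the inputs into a representation (OR, NAND) in which the XOR targets become linearly separable for the output unit.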
As with single perceptrons, multi-layer networks can be used for learning tasks. However, the learning algorithm we discuss (the backpropagation routine) is derived mathematically using differential calculus. The derivation relies on having a differentiable threshold function, which effectively rules out perceptron units if we want to be certain that backpropagation works correctly: the step function in perceptrons is not continuous, and hence not differentiable. An alternative unit was therefore chosen that has similar properties to the step function in perceptron units, but which is differentiable.
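The standard differentiable replacement for the step function is the sigmoid (logistic) function, which has the same limiting behaviour as the step but a smooth, closed-form derivative. A minimal sketch of the comparison:

```python
import math

def step(x):
    # Perceptron threshold: discontinuous at 0, so not differentiable there,
    # and its derivative is 0 everywhere else (useless for gradient descent).
    return 1 if x > 0 else 0

def sigmoid(x):
    # Smooth "squashing" function: approaches 0 for large negative x
    # and 1 for large positive x, like the step function.
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_derivative(x):
    # Closed-form derivative: sigma'(x) = sigma(x) * (1 - sigma(x)),
    # which is what backpropagation uses when computing weight updates.
    s = sigmoid(x)
    return s * (1 - s)

for x in (-5.0, 0.0, 5.0):
    print(x, step(x), round(sigmoid(x), 4), round(sigmoid_derivative(x), 4))
```

Note that the sigmoid's derivative is nonzero everywhere, so small changes in a weight always produce a measurable change in the unit's output, which is exactly what the calculus-based derivation of backpropagation requires.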