Example Multi-layer ANN with Sigmoid Units:
Here we will concern ourselves only with ANNs containing a single hidden layer, as this makes describing the backpropagation routine easier. Networks in which the input is fed in on the left and propagated forward to produce an output are called feed-forward networks.
Below is such an ANN, with two sigmoid units in the hidden layer. The weights between the units have been set arbitrarily.
Note that the sigmoid units are identified by sigma signs inside the nodes of the graph. As we did with perceptrons, we can give this network an input and determine its output. We can also check which units "fired", i.e., had a value closer to 1 than to 0.
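As a sketch of this forward propagation, the following Python snippet computes the output of a small feed-forward network with two sigmoid hidden units and one sigmoid output unit. The weight values here are made up for illustration (the figure's actual arbitrary weights are not reproduced in the text), as is the choice of two inputs:

```python
import math

def sigmoid(x):
    # Logistic activation: maps any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

# Hypothetical weights: two inputs -> two hidden sigmoid units -> one output unit
hidden_weights = [
    [0.5, -0.4],   # weights from inputs x1, x2 into hidden unit h1
    [0.9,  0.2],   # weights from inputs x1, x2 into hidden unit h2
]
output_weights = [0.3, -0.8]  # weights from h1, h2 into the output unit

def forward(x1, x2):
    # Propagate the input left-to-right through the network
    hidden = [sigmoid(w1 * x1 + w2 * x2) for w1, w2 in hidden_weights]
    output = sigmoid(sum(w * h for w, h in zip(output_weights, hidden)))
    return hidden, output

hidden, output = forward(1.0, 0.0)
# A unit "fires" when its value is closer to 1 than to 0
fired = [h > 0.5 for h in hidden] + [output > 0.5]
print("hidden:", hidden, "output:", output, "fired:", fired)
```

With the input (1, 0), both hidden units fire but the output unit does not, because its negative incoming weight from h2 pulls its weighted sum below zero.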