See also: multi-layer perceptrons
The central paradigm of neural networks is local computing. A neural network gains its power from connecting many processing units, each of which has only very restricted capabilities and knows nothing about the "higher" goals of the network. The "know-how" of the network is therefore contained not in its processing units but in the connections between them.
The actual implementation of a processing unit may differ widely and depends on the model being used. The example below shows a processing unit as it is used in multi-layer perceptrons.
This processing unit consists of three parts:
- a propagation function, which combines the incoming signals into a net input (typically a weighted sum of the inputs),
- an activation function, which computes the unit's activation from the net input, and
- an output function, which determines the signal that is passed on to the connected units.
Distinguishing between the activation and the output function is somewhat arbitrary, and the distinction is often dropped by setting one of the two to the identity function. It is important, however, that one of these functions has a non-linear response curve, most often a sigmoid curve.
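As a minimal illustration, the following Python sketch implements such a three-part unit. The function name, the example weights, and the choice of the logistic sigmoid as the non-linear function are assumptions made for this example; the text above only specifies the general structure.

```python
import math

def processing_unit(inputs, weights, bias=0.0):
    """A single processing unit as used in a multi-layer perceptron."""
    # 1. Propagation function: weighted sum of the incoming signals (net input)
    net = sum(w * x for w, x in zip(weights, inputs)) + bias

    # 2. Activation function: non-linear response curve, here the logistic sigmoid
    activation = 1.0 / (1.0 + math.exp(-net))

    # 3. Output function: set to the identity, so the output equals the activation
    return activation

# Example: a unit with two inputs and hypothetical weights
print(processing_unit([0.5, -1.0], [0.8, 0.3]))  # approximately 0.525
```

Note that the unit itself carries no knowledge of the network's task; only the weights (the connections) determine what it contributes to the overall computation.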
Last Update: 2006-Jan-17