ADALINE

ADALINE (Adaptive Linear Neuron, later Adaptive Linear Element) is a single-layer neural network. It was developed by Professor Bernard Widrow and his graduate student Ted Hoff at Stanford University in 1960. It is based on the McCulloch–Pitts neuron and consists of a weight vector, a bias, and a summation function.



Definition
ADALINE is a single-layer neural network with multiple nodes, where each node accepts multiple inputs and generates one output. Given the following variables:
 * x is the input vector
 * w is the weight vector
 * n is the number of inputs
 * $$\theta$$ is a constant (the bias)
 * y is the output

then we find that the output is $$y=\sum_{j=1}^{n} x_j w_j + \theta$$. If we further assume that
 * $$ x_{n+1} = 1$$
 * $$w_{n+1} = \theta$$

then the output reduces to the dot product of x and w: $$y = x \cdot w$$
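The forward pass above can be sketched as follows. This is a minimal illustration (the function name and example values are hypothetical): appending the constant input $$x_{n+1}=1$$ and the bias weight $$w_{n+1}=\theta$$ collapses the weighted sum plus bias into a single dot product.

```python
import numpy as np

def adaline_output(x, w, theta):
    # Append the constant input x_{n+1} = 1 and the bias weight
    # w_{n+1} = theta, so y = sum_j x_j w_j + theta becomes one dot product.
    x_aug = np.append(x, 1.0)
    w_aug = np.append(w, theta)
    return np.dot(x_aug, w_aug)
```

For example, with x = (1, 2), w = (0.5, 0.5) and θ = 0.1, the output is 1·0.5 + 2·0.5 + 0.1 = 1.6.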

Learning Algorithm
Let us assume:
 * $$\eta$$ is the learning rate (some constant)
 * d is the desired output
 * o is the actual output

then the weights are updated as follows: $$w \leftarrow w + \eta(d-o)x$$. ADALINE converges to the minimum of the squared error $$E=(d-o)^2$$. For a more comprehensive proof, see Adaline (Adaptive Linear)
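A minimal sketch of this learning rule follows. The function name, learning rate, and training data are illustrative assumptions, not part of the original algorithm description; the bias θ is learned as the last weight by giving every input vector a trailing component of 1, as in the Definition section.

```python
import numpy as np

def train_adaline(X, d, eta=0.1, epochs=500):
    """Delta-rule training: w <- w + eta * (d - o) * x.

    X is a matrix of input vectors, each with a trailing 1 so the bias
    theta is absorbed into the last weight. d holds the desired outputs.
    """
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])  # small random initial weights
    for _ in range(epochs):
        for x, target in zip(X, d):
            o = np.dot(x, w)                 # actual (linear) output
            w += eta * (target - o) * x      # delta-rule weight update
    return w
```

For a target that is itself linear in the inputs (e.g. d = 2x₁ − x₂ + 0.5), repeated updates drive the squared error $$E=(d-o)^2$$ toward zero, and the learned weights approach the true coefficients.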