> From: "Musselwhite Charles" <CBAM195@psy.soton.ac.uk>
> Date: Tue, 21 May 1996 10:46:39 GMT
> Perceptrons were the first of many neural network models.
> They build on the neurone model proposed by McCulloch and Pitts in
> 1943. As with all computational models, perceptrons aim to capture
> human cognitive abilities. They use a network consisting of elementary,
> neurone-like units, or nodes, that are linked together. The perceptron
> has two inputs (x1 and x2) and one output (y). The output is based on
> the inputs and is either +1, if the weighted sum of the inputs is above
> a certain threshold, or -1 should it fall below that threshold.
The perceptron can have more inputs and outputs than that, but it only
has those two layers -- input and output -- and no "hidden" layers in
between.
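The unit described above can be sketched in a few lines of Python (a
hypothetical illustration, not code from the course or the readings): a
weighted sum of the inputs passed through a +1/-1 threshold.

```python
# Minimal two-input perceptron unit: output +1 if the weighted sum
# of the inputs exceeds the threshold, otherwise -1.
def perceptron_output(x1, x2, w1, w2, threshold):
    total = w1 * x1 + w2 * x2
    return 1 if total > threshold else -1

# With weights 1, 1 and threshold 1.5 this single unit computes AND
# (treating +1 as "yes" and -1 as "no"):
print(perceptron_output(1, 1, 1.0, 1.0, 1.5))  # +1
print(perceptron_output(1, 0, 1.0, 1.0, 1.5))  # -1
```

The weights and threshold here are chosen by hand just to show the unit
in action; learning, discussed below, is about finding such weights
automatically.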
> In Best's article on Perceptrons he describes how
> Rosenblatt (1958) found that perceptrons were capable of learning
> through feedback on trial-and-error tasks. When a correct response or
> output is found, the connections are strengthened. The connections'
> strength is decreased when a wrong response or output is given by the
> perceptron. It can, therefore, learn 'AND' and 'INCLUSIVE OR'
> problems :-
> INPUT (x1/x2) PROBLEM OUTPUT (y)
> 1 / 1 AND 1 = correct
> 1 / 0 AND 0 = wrong
> 0 / 1 AND 0 = wrong
> 0 / 0 AND 0 = wrong
> 1 / 1 INCLUSIVE OR 1 = correct
> 1 / 0 INCLUSIVE OR 1 = correct
> 0 / 1 INCLUSIVE OR 1 = correct
> 0 / 0 INCLUSIVE OR 0 = wrong
> In the 'AND' condition, after trial and error with
> feedback, it learns that the output should be 1 only when two '1's
> are input. In the 'OR' condition it can learn, through trial
> and error with feedback, that the output should be 1 when a '1' is
> input on either input - x1 or x2.
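The trial-and-error-with-feedback procedure sketched above can be
illustrated in Python (an assumed reconstruction of Rosenblatt-style
error-correction learning, with 0/1 targets, a fixed learning rate, and
the bias treated as an extra weight -- not code from the readings):

```python
# Error-correction learning: nudge each weight up when the output was
# too low, down when it was too high, in proportion to its input.
def train_perceptron(examples, epochs=20, rate=0.1):
    w1 = w2 = bias = 0.0
    for _ in range(epochs):
        for (x1, x2), target in examples:
            out = 1 if w1 * x1 + w2 * x2 + bias > 0 else 0
            err = target - out           # feedback signal
            w1 += rate * err * x1        # strengthen/weaken connections
            w2 += rate * err * x2
            bias += rate * err
    return lambda x1, x2: 1 if w1 * x1 + w2 * x2 + bias > 0 else 0

AND = [((1, 1), 1), ((1, 0), 0), ((0, 1), 0), ((0, 0), 0)]
OR  = [((1, 1), 1), ((1, 0), 1), ((0, 1), 1), ((0, 0), 0)]

and_net = train_perceptron(AND)
or_net = train_perceptron(OR)
print([and_net(x1, x2) for (x1, x2), _ in AND])  # [1, 0, 0, 0]
print([or_net(x1, x2) for (x1, x2), _ in OR])    # [1, 1, 1, 0]
```

Because AND and OR are linearly separable, this procedure is guaranteed
to converge on correct weights for them (the perceptron convergence
theorem).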
Ah, I see why you thought it could only have two inputs. But McClelland
and Rumelhart's past-tense learning net (see Pinker's critique of
neural nets) was also a perceptron, and it had many more inputs and
outputs than just two: all the present and past tense verbs of English
(present as input, past as output).
> However, the perceptron was criticised by Minsky because
> it cannot perform simple 'EXCLUSIVE OR' (or XOR) tests.
> It cannot learn the correct response to the problem when one input is
> different from the other, e.g. (it cannot perform this) :-
Not when the inputs are different, but when the rule it must learn is
that the input is in one category when either one input or the other is
on, but not when both are on or both are off.
> INPUT (x1/x2) PROBLEM OUTPUT
> 1 / 1 XOR 0 = wrong
> 1 / 0 XOR 1 = correct
> 0 / 1 XOR 1 = correct
> 0 / 0 XOR 0 = wrong
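The impossibility can be illustrated in Python (a brute-force sketch
over a hypothetical grid of weights, not a formal proof): no single
threshold unit w1*x1 + w2*x2 > t reproduces the XOR table.

```python
# XOR truth table: output 1 exactly when the inputs differ.
XOR = {(1, 1): 0, (1, 0): 1, (0, 1): 1, (0, 0): 0}

def solves_xor(w1, w2, t):
    # Does this single threshold unit match all four XOR cases?
    return all((w1 * x1 + w2 * x2 > t) == bool(y)
               for (x1, x2), y in XOR.items())

grid = [i / 4 for i in range(-8, 9)]     # weights/thresholds in [-2, 2]
found = any(solves_xor(w1, w2, t)
            for w1 in grid for w2 in grid for t in grid)
print(found)  # False: XOR is not linearly separable
```

The grid search finds nothing because XOR is not linearly separable: no
single line through the (x1, x2) plane puts (1,0) and (0,1) on one side
and (1,1) and (0,0) on the other.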
> Up until this point, it had been thought that the
> perceptron was a very good model for how the brain worked. Following
> this finding by Minsky, it clearly could not, by itself, model the
> brain: many of the day-to-day functions of humans and animals are
> based upon 'EXCLUSIVE OR' problems. Later work did reveal, however,
> that by adding an extra 'hidden' layer the perceptron could solve the
> 'EXCLUSIVE OR' problem.
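A hidden layer makes the difference because the hidden units can recode
the inputs into a linearly separable form. Here is a hand-wired sketch
in Python (weights chosen by hand for illustration, not learned): one
hidden unit computes OR, another computes NAND, and the output unit
ANDs them together, which is exactly XOR.

```python
def step(s):
    # Simple threshold activation: fire (1) if the net input is positive.
    return 1 if s > 0 else 0

def xor_net(x1, x2):
    h1 = step(x1 + x2 - 0.5)      # hidden unit 1: OR of the inputs
    h2 = step(-x1 - x2 + 1.5)     # hidden unit 2: NAND of the inputs
    return step(h1 + h2 - 1.5)    # output unit: AND of the hidden units

print([xor_net(x1, x2) for x1, x2 in [(1, 1), (1, 0), (0, 1), (0, 0)]])
# [0, 1, 1, 0]
```

Each individual unit is still a plain threshold unit; it is the extra
layer, not any new kind of unit, that lets the network represent XOR.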
Good job. For an "A," though, you need to go a bit beyond the immediate
demands of the question and show you can relate it to the other issues
that have been discussed in the course (such as neural nets in general,
and symbolic algorithms, the overall capacity of the mind, and the other
critiques that have been made of nets and their rivals).
This archive was generated by hypermail 2b30 : Tue Feb 13 2001 - 16:23:42 GMT