> From: "Mason Corinne" <CJM395@psy.soton.ac.uk>
> Date: Tue, 28 May 1996 09:41:06 GMT
> 
> Exclusive Or (XOR) refers to a situation whereby a decision is based 
> on one, and only one, of two conditions being satisfied. For 
> instance, if I dislike crowds I may decide to go to the beach if it 
> is sunny, or if it is a bank holiday, but not both.
> This is a Boolean logic function, and if each condition is assigned
> a value of 1 if it is met, and 0 if it is not met, then a table can 
> be drawn up representing this, with values of 0 and 1 indicating
> the decision:
> 
>       1        0
> 1      0        1
> 0      1        0
> 
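The table is just the XOR truth function. As a quick sketch of my own (not from the post):

```python
def xor(a, b):
    # XOR: output is 1 when exactly one of the two inputs is 1.
    return 1 if (a + b) == 1 else 0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```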
> This is a difficult concept for the human mind to grasp; we usually 
> function using and/or conditions. For a network, it constitutes the 
> basis of the XOR problem. How can a network be constructed so that 
> it will arrive at the correct output when the input data is based 
> on XOR reasoning, and has nothing in common?
Nothing in common except the feature "A XOR B"...
> A network consists of a set of units that are each connected to all 
> the units of the next layer, but can only communicate with 
> each other by means of very simple signals. 
Not sure what that last phrase meant: compared to what?
> The basic 2-layer perceptron (input and output, with no hidden 
> layer) is not capable of such processing, and this was Minsky's 
> critique. What is required to accomplish XOR processing is a 
> network such that,
> if the input is a pair of binary digits (each of which can be 0 or 1),
> and the output is a single binary digit, the output value is 1 
> if exactly one of the inputs is 1, but 0 if neither or both is 1.
> The answer to how this kind of decision can be made lies in the 
> network having one or more hidden layers between the input and output 
> layers. The units of the hidden layer are isolated from the network's
> environment, and the connections pass from the input layer through 
> the hidden layer to the output layer. Each unit at a level is 
> connected to all units of the next higher layer. 
> The units can only transmit simple numerical values - each input 
> unit receives 1 or 0 and sends an output value of 1 or 0 along each 
> of its connections with other units. Each connection has a weight which is 
> either positive, negative or 0, and each unit has a bias. The
> incoming value is multiplied by the weight on each of 
> its connections, and the sum of the products is added to the bias 
> that is associated with each unit. The resulting value is then 
> assigned an activation value of 0 or 1, according to the threshold of the 
> unit, and if the unit is thus activated it continues to propagate its 
> value to the output layer via another weighted connection.
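The forward pass described above can be made concrete by hand-wiring a network with two hidden threshold units. The particular weights and biases below are one standard textbook solution, chosen for illustration, not taken from the post:

```python
def step(x):
    # Threshold activation: the unit fires (1) only if its net input exceeds 0.
    return 1 if x > 0 else 0

def xor_net(a, b):
    # Hidden unit 1 acts like "a OR b": fires if at least one input is 1.
    h1 = step(1.0 * a + 1.0 * b - 0.5)   # bias -0.5
    # Hidden unit 2 acts like "a AND b": fires only if both inputs are 1.
    h2 = step(1.0 * a + 1.0 * b - 1.5)   # bias -1.5
    # Output unit: "OR but not AND", which is exactly XOR.
    return step(1.0 * h1 - 2.0 * h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

The hidden units matter because neither OR nor AND alone separates the XOR cases; the output unit combines the two hidden features linearly, which a network with no hidden layer cannot do.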
> Another advantage of this system is that changing the weights 
> allows a network to learn from past experience, 
> and thus improve its performance through the process of 
> backpropagation.   
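A minimal sketch of that learning process, under two assumptions of mine: the hard 0/1 threshold above is replaced by a smooth sigmoid (backpropagation needs a differentiable activation), and the initial weights are small random values:

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2 inputs -> 2 hidden units -> 1 output, all sigmoid.
w_h = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(2)]
b_h = [random.uniform(-1, 1) for _ in range(2)]
w_o = [random.uniform(-1, 1) for _ in range(2)]
b_o = random.uniform(-1, 1)

def forward(a, b):
    h = [sigmoid(w_h[j][0] * a + w_h[j][1] * b + b_h[j]) for j in range(2)]
    out = sigmoid(w_o[0] * h[0] + w_o[1] * h[1] + b_o)
    return h, out

data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def total_error():
    return sum((forward(a, b)[1] - t) ** 2 for (a, b), t in data)

err_before = total_error()
lr = 0.5
for epoch in range(20000):
    for (a, b), target in data:
        h, out = forward(a, b)
        # Error signal at the output, then propagated back to the hidden layer.
        d_out = (out - target) * out * (1 - out)
        d_h = [d_out * w_o[j] * h[j] * (1 - h[j]) for j in range(2)]
        # Gradient-descent weight and bias updates.
        for j in range(2):
            w_o[j] -= lr * d_out * h[j]
            w_h[j][0] -= lr * d_h[j] * a
            w_h[j][1] -= lr * d_h[j] * b
            b_h[j] -= lr * d_h[j]
        b_o -= lr * d_out

err_after = total_error()
print("squared error before:", round(err_before, 3), "after:", round(err_after, 3))
```

The point of the sketch is the direction of information flow: activation goes forward, error goes backward, and the hidden-layer weights are adjusted even though those units never see the target directly.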
Good! To put it over the top, integrate it with the bigger issues about
nets, symbols, categorisation, reverse engineering.