
Code

Single Layer Neural Network: OR, AND, XOR

This is a demo of a Single Layer Neural Network (NN) with supervised learning. A single-layer NN can solve linear problems, that is, it can linearly separate a set of data. The example shown below uses a single-layer NN with two inputs, one biasing input neuron, and one output neuron. The neurons are connected through weights that are trained using the delta rule. We apply random inputs and train on the functions OR, AND, and XOR in three different NNs. The net learns OR and AND successfully, whereas the XOR error does not converge: XOR is a nonlinear function of its inputs, so it cannot be learned by a single-layer NN. A multi-layer neural network is needed to learn nonlinear functions such as XOR.

Straight Line

This is a demo of an application of a Single Layer Neural Network with supervised learning. A rectangular space is divided by a straight line, and the coordinates in the two partitions are separated using the delta rule in a single-layer neural network with the same architecture as in the OR and AND case, except that the inputs are real numbers instead of binary values.

Multi-Layer Perceptron: XOR Data

This is a case of learning a non-linear function. Here a two-layer Neural Network is used with a supervised learning algorithm: gradient-descent error back propagation (NNBP). The input-output training data set is the XOR function. Initially, random weights populate all the connections. The output is calculated and compared with the desired XOR output. The resulting error function, also known as the cost function, is used to correct the weights in the different layers. The sensitivity of the output error to each weight is calculated as the first partial derivative of the cost function with respect to that weight. A correction proportional to this error sensitivity is then applied, scaled by a learning coefficient, often called ETA, which usually lies between 0 and 1.

Multi-Layer Perceptron: Circle Data

This is a demo of an application of a Two Layer Neural Network with supervised learning. A rectangular space is divided by a circle, and the coordinates in the two partitions are separated using the back propagation algorithm. It uses the same architecture as in the XOR case, except that the inputs are real numbers instead of binary values and the network has 12 hidden neurons.

Back Propagation Algorithm (BP)

A general-purpose Back Propagation algorithm is implemented in C and can be adapted for many applications. It is a three-layer feed-forward neural network with normalized, real-valued inputs and outputs. The demo code uses an XOR dataset with four input neurons, four hidden neurons, and two output neurons.

Kohonen Neural Network

The Kohonen Neural Network classifies an input dataset using an unsupervised learning algorithm. It uses a single-layer network whose outputs represent the classes. The example shown here classifies English characters by their shape; in Kohonen's classification, neighboring classes are similar in shape. The input-output dataset of the Kohonen network after training is shown below.

Hopfield Neural Network

The Hopfield Neural Network is a recurrent network with binary outputs. It is used as a memory model that recalls a bit pattern from a partial input, and the same network can memorize several binary images. The demonstration code shows how a network with 64 inputs and outputs (8x8), storing an English character set, recalls a character from a noisy input.
