
Neural networks have a long history of study, from the influential ideas of the neuropsychologist Hebb (1949) about the structure and behaviour of biological neural systems up to recent models of artificial neural systems. McCulloch, a neurophysiologist, and Pitts, a logician (1943), were the first to use formal logic to model networks of neurons as binary devices with fixed thresholds, connected by synapses. Many distinguished researchers followed, such as Rosenblatt (1958), who extended the idea of the neuron to the perceptron as an element of a self-organized computational network capable of learning by feedback and structural adaptation. Moreover, Widrow and Hoff (1960) created and implemented the analogue electronic devices known as ADALINE (Adaptive Linear Element) and MADALINE (Multiple ADALINE) to imitate neurons and perceptrons. They used the Least Mean Squares (LMS) algorithm to train the devices to learn the pattern vectors presented to their inputs. Their model of the basic element of a neural network is still used today: they considered the perceptron an adaptive element bearing a resemblance to the neuron.

A neuron, as the fundamental building block of a neural information processing system, is made up of a cell body with a nucleus, dendrites that feed external signals to the cell body, and axons that carry signals out of the cell to other cell bodies. In terms of analogue computational technology, the core part of the element called a perceptron contains a summing element Σ and a nonlinear element NL corresponding to the cell body; the multiple signal inputs, connected to the core part via adjustable weighting elements, correspond to the dendrites; and the signal output corresponds to the axon. In the core part of the perceptron, a nonlinear function is implemented as its activation function.
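The adaptive element described above can be sketched in a few lines: a weighted sum of inputs trained by the Widrow–Hoff LMS rule, with a step function as the activation. The AND-gate data set, learning rate and epoch count below are illustrative choices, not part of the original designs.

```python
# A minimal sketch of a Widrow–Hoff (LMS) adaptive linear element.
# The AND-gate data, learning rate and epoch count are illustrative assumptions.
import numpy as np

def lms_train(X, d, lr=0.2, epochs=100):
    """Adjust the weights with the LMS rule: w <- w + lr * (d - w.x) * x."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])  # append a constant bias input
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, d):
            error = target - np.dot(w, x)  # error of the linear (unthresholded) output
            w += lr * error * x
    return w

def predict(w, X):
    """Threshold the weighted sum with a binary step activation."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])
    return (X @ w > 0).astype(int)

# AND gate: a linearly separable pattern, so a single adaptive element suffices
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([-1, -1, -1, 1], dtype=float)  # bipolar targets for LMS training

w = lms_train(X, d)
print(predict(w, X))  # [0 0 0 1]
```

The LMS rule drives the weights toward the least-squares fit of the targets; thresholding that linear output then reproduces the AND pattern.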
Block (1962) selected the binary step function for this purpose, a choice that helped pave the way for the sigmoid activation function. The perceptron learns through a training process based on a set of collected data: during training, it adjusts its interconnection weights according to the data presented at its input. However, a single perceptron alone cannot learn enough to solve more complex problems, because its range of computational action is restricted by the simplicity of its structure. This limitation was overcome by building multilayer perceptrons (MLPs), which, in addition to the perceptron input layer and output layer, include so-called hidden layers inserted between them to form a cascaded network structure with extended connectionist capabilities. In practice, one hidden layer is usually sufficient to build a network with the extended computational capabilities needed to solve the majority of practical problems; only in rare cases are additional hidden layers needed. This also holds in time series analysis and forecasting applications.

Minsky and Papert (1969) argued that multilayer perceptron systems had limited learning capabilities, like the one-layer perceptron system. Their position was later refuted by Rumelhart and McClelland (1986), who showed that multilayer neural networks have nonlinear discriminating capabilities and can learn more complex patterns through backpropagation learning.

From a practical standpoint, neural networks have proven to be a powerful tool for feature extraction, data classification and pattern recognition. Their strong capability of learning from observation data has made them widely accepted by engineers and researchers as a tool for processing experimental data.
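The nonlinear discrimination that a single perceptron lacks can be demonstrated on the classic XOR pattern, which is not linearly separable. The sketch below trains a one-hidden-layer MLP with sigmoid activations by backpropagation; the hidden-layer size, learning rate and epoch count are illustrative assumptions.

```python
# A minimal sketch of a one-hidden-layer MLP trained by backpropagation on XOR.
# Layer size, learning rate, epoch count and initialization are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: not linearly separable

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # input -> hidden layer
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # hidden -> output layer

lr, losses = 0.5, []
for _ in range(5000):
    # forward pass through the hidden and output layers
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    losses.append(float(np.mean((out - y) ** 2)))

    # backward pass: chain rule through the sigmoid derivatives
    d_out = 2 * (out - y) * out * (1 - out) / len(X)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

print(f"initial loss {losses[0]:.3f} -> final loss {losses[-1]:.3f}")
```

The hidden layer gives the network the cascaded structure described above; backpropagation steadily reduces the output error, which a single-layer perceptron cannot do on this pattern.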
Another reason for this is that they enormously reduce the computational effort needed for problem-solving, while their massive parallelism considerably accelerates the computational process. Their migration to industry, business and financial engineering was therefore a natural step. For instance, approaches and methodologies based on neural networks have efficiently solved the fundamental problems of time series analysis, forecasting and prediction from collected observation data, as well as the problems of on-line modelling and control of dynamic systems from sensor data.
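In forecasting applications of this kind, the observation data are typically framed as supervised pairs before any network is trained. The sketch below shows one common framing, assuming a sliding window of past values as the input vector and the next value as the target; the window length is an illustrative choice.

```python
# A minimal sketch of framing a time series for neural forecasting,
# assuming a sliding window of lagged values; the lag of 3 is illustrative.
import numpy as np

def make_windows(series, lag=3):
    """Turn a 1-D series into (X, y) pairs: lagged inputs and next values."""
    X = np.array([series[i:i + lag] for i in range(len(series) - lag)])
    y = np.array(series[lag:])
    return X, y

series = [1, 2, 3, 4, 5, 6, 7]
X, y = make_windows(series, lag=3)
print(X.shape, y.shape)  # (4, 3) (4,)
```

Each row of `X` is one input pattern for the network, and the corresponding entry of `y` is the value it should learn to predict.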
