Neural Network Concept in Artificial Intelligence
Since the 1980s there have been renewed research efforts dedicated to neural networks. The present interest is largely due to the difficult problems confronting artificial intelligence, to a deeper understanding of how the brain works, and to recent developments in theoretical models, technologies, and algorithms. One motivation of neural network research is the desire to build a new breed of powerful computers to solve a variety of problems that have proved very difficult for conventional computers. Another motivation is the desire to develop cognitive models that can serve as an alternative approach to artificial intelligence. Human brain functions have not yet been successfully simulated in an AI system. Some existing neural networks, on the other hand, have shown potential for these abilities. Using their self-organization capabilities, neural networks are able to acquire and organize knowledge through learning in response to external stimuli. This paper addresses many techniques used in neural networks and their possible applications in artificial intelligence. Some general information about hybrid intelligent systems is also provided.
A variety of neural network models have been developed by researchers of different backgrounds, from different points of view, and with different aims and applications. However, all neural networks are emulations of biological neural systems. With such an emulation it is hoped that some brain abilities, such as generalization and attention focusing, can be simulated. A neural network can be defined in many ways. From the structural point of view, a neural network can be defined as a directed network (or graph) with its nodes representing neurons. Generally speaking, neural networks are specified by 1) node (processing unit) characteristics, 2) network topology, and 3) the learning paradigm.

Node characteristics. The nodes of a directed network are called processing units. The state of a unit represents the potential of a neuron; it is also called the activation. The state of a neuron is affected by its previous state, the total accumulated input signal, and the activation function. As the signal generated at a neuron cell body is transmitted down the axon and then distributed to the synapses, the properties of this transmission path may affect the ultimate signal that arrives at the synapses. This is described by the output function of the unit.

1. Input function: The synapses and their signal modulations are simulated by links and link functions in neural networks. The signals received by a unit may come from different types of output signals (binary, continuous, symbolic), may undergo modulation by different link functions, and may arrive over links of different types (inhibitory, excitatory). Ports for input signals are called sites. All incoming links are impinging upon the...
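The node characteristics described above can be sketched in a few lines of code. The following is a minimal illustrative model, not one taken from the paper: the class and function names are assumptions, link functions are reduced to scalar weights (positive for excitatory links, negative for inhibitory), and a logistic sigmoid is chosen as the activation function since the text does not fix a particular one.

```python
import math

def sigmoid(x):
    # One common choice of activation function; the paper does not fix one.
    return 1.0 / (1.0 + math.exp(-x))

class Unit:
    """A single processing unit (hypothetical sketch).

    state   -- the unit's activation, modeling the neuron's potential
    weights -- link functions reduced to scalar weights; a positive
               weight acts as an excitatory link, a negative weight
               as an inhibitory link
    """
    def __init__(self, weights):
        self.weights = list(weights)
        self.state = 0.0

    def step(self, inputs):
        # Input function: accumulate the weighted incoming signals.
        net = sum(w * x for w, x in zip(self.weights, inputs))
        # Activation function: map the accumulated input to a new state.
        self.state = sigmoid(net)
        # Output function: here simply the identity, so the signal sent
        # down the "axon" equals the unit's state.
        return self.state

# One excitatory (+1.0) and one inhibitory (-0.5) link;
# with inputs (1.0, 1.0) the net input is 1.0 - 0.5 = 0.5.
unit = Unit([1.0, -0.5])
out = unit.step([1.0, 1.0])
```

In a fuller model the output function could differ from the identity, and the unit's previous state could enter the update, as the text notes; both are omitted here for brevity.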