15-11-2012, 12:11 PM
REPORT ON ARTIFICIAL NEURAL NETWORKS
Abstract:
In this paper, we describe the artificial evolution of adaptive neural controllers for an outdoor mobile robot equipped with a mobile camera. The robot can dynamically select the gazing direction by moving its body and/or the camera. The neural control system, which maps visual information to motor commands, is evolved online by means of a genetic algorithm, but the synaptic connections (receptive fields) from visual photoreceptors to internal neurons can also be modified by Hebbian plasticity while the robot moves in the environment. We show that robots evolved in physics-based simulations with Hebbian visual plasticity display more robust adaptive behavior when transferred to real outdoor environments than robots evolved without visual plasticity. We also show that the formation of visual receptive fields is significantly and consistently affected by active vision, as compared to receptive fields formed from grid-sampled images of the robot's environment. Finally, we show that the interplay between active vision and receptive field formation amounts to the selection and exploitation of a small and constant subset of the visual features available to the robot.
INTRODUCTION
Biological vision systems filter, compress, and organize the large amount of optical stimulation as electrical signals proceed from the retina towards deeper structures of the brain. This data reduction is achieved by a layered, distributed, and topologically organized set of neurons that individually respond to specific aspects of the optical stimulus. In mammals, for example, neurons in the early stage of the visual cortex selectively respond to particular features of the environment, such as oriented edges [9], which are linear combinations of the pattern of retinal activations. Neurons in later stages of the visual cortex respond to more complex patterns that also take into account the direction of movement of the stimulus and cannot easily be reduced to a linear combination of lower-level features [1].
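To make the notion of a linear receptive field concrete, the sketch below models a single early-visual neuron as a weighted sum of retinal activations. The kernel shown is a simple hypothetical vertical-edge detector; it is our own illustration, not a model from the paper.

```python
import numpy as np

# Hypothetical vertical-edge kernel: negative weights on the left,
# positive on the right, so a dark-to-bright transition excites the neuron.
kernel = np.array([[-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0]])

def response(patch, kernel):
    """Linear response of a model neuron: a weighted sum (dot product)
    of the input activations with the neuron's receptive-field kernel."""
    return float(np.sum(patch * kernel))

# A patch containing a vertical edge drives the neuron strongly;
# a uniform patch produces no response.
edge = np.array([[0.0, 0.5, 1.0]] * 3)   # dark left, bright right
flat = np.full((3, 3), 0.5)              # uniform gray
print(response(edge, kernel))  # strong positive response
print(response(flat, kernel))  # zero response
```

The higher-level neurons mentioned above (direction-selective, pattern-selective) cannot be captured by such a single dot product, which is exactly the distinction the paragraph draws.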
The features that trigger the response of a neuron constitute the receptive field of that neuron. The receptive fields of cortical visual neurons are not entirely genetically determined, but develop during the first weeks of life, and there is evidence that this process may already start before birth. Studies of newborn kittens raised in boxes with only vertical texture show that these animals do not develop as many receptive fields for horizontal features as kittens raised in normal environments [8] and therefore see the world in a different way. The development of visual receptive fields occurs through Hebbian synaptic plasticity, an adaptive process based on the degree of correlated activity of pre- and post-synaptic neurons.
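A minimal sketch of how Hebbian plasticity shapes a receptive field, assuming a plain Hebb rule with weight normalization (the paper's exact learning rule may differ; all names here are our own):

```python
import numpy as np

rng = np.random.default_rng(0)

def hebbian_step(w, pre, eta=0.1):
    """One Hebbian update: strengthen weights in proportion to the
    correlated activity of pre- and post-synaptic neurons."""
    post = float(w @ pre)          # post-synaptic activation
    w = w + eta * post * pre       # Hebb: change ~ pre * post
    return w / np.linalg.norm(w)   # normalize to keep weights bounded

# Expose the neuron to noisy inputs dominated by one recurring pattern;
# the weight vector gradually aligns with that pattern, i.e. the neuron
# develops a receptive field for the feature it is repeatedly exposed to
# (as with the kittens raised among only vertical textures).
w = rng.normal(size=4)
feature = np.array([1.0, 1.0, 0.0, 0.0]) / np.sqrt(2)
for _ in range(500):
    w = hebbian_step(w, feature + 0.05 * rng.normal(size=4))
print(np.round(w, 2))  # w now points (up to sign) along the recurring feature
```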
METHOD
We use a Koala (K-Team S.A.) wheeled robot equipped with a pan/tilt camera (Sony EVI-D31) and infrared proximity sensors distributed around the body of the robot (Figure 1). The infrared sensors are used only by the operating system to detect collisions and reposition the robot between trials; their activation values are not given to the neural controller. The robot has three wheels on each side, but only the central wheel (which is slightly lower) is motorized; the remaining two wheels on each side are passive.
Experiments
We have carried out two sets of evolutionary experiments (Figure 3) to investigate whether ontogenetic development of receptive fields provides an adaptive advantage in new environmental conditions (namely, when transferred from a simulated to a real outdoor environment) and to investigate the interactions between evolution and learning. The evolutionary experiments are carried out in simulation, and the best evolved individuals are tested in the real environment. In the first condition (“No learning”), which serves as a control condition, all synaptic connections are genetically encoded and evolved without learning. In the second condition (“Learning”), learning is enabled for the connections from visual neurons to hidden neurons, which are not genetically encoded but initialized to small random values. Connection strengths developed during learning are not transmitted to offspring. The fitness function selects robots for their ability to move straight forward for as long as possible over the lifetime of the individual. This is quantified by measuring the amount of forward rotation of the two motorized wheels of the robot. Each individual is decoded and tested for four trials, each lasting 400 sensorimotor cycles of 300 ms. A trial can be truncated earlier if the operating system detects an imminent collision with the infrared distance sensors.
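The trial structure and fitness measure described above can be sketched as follows. The simulator interface (`simulate_cycle`, `collision_imminent`) and all constants' names are hypothetical stand-ins for the actual experimental software:

```python
# Trial parameters taken from the experiment description.
TRIALS = 4
CYCLES = 400       # sensorimotor cycles per trial
CYCLE_MS = 300     # duration of one cycle in milliseconds

def evaluate(individual, simulate_cycle, collision_imminent):
    """Fitness = total forward rotation of the two motorized wheels,
    accumulated over four trials; a trial is truncated early if the
    infrared sensors report an imminent collision."""
    fitness = 0.0
    for _ in range(TRIALS):
        for _ in range(CYCLES):
            left, right = simulate_cycle(individual)   # wheel rotations
            fitness += max(left, 0.0) + max(right, 0.0)  # forward motion only
            if collision_imminent():
                break   # operating system truncates the trial
    return fitness
```

Counting only forward rotation rewards individuals that keep moving straight ahead for the full lifetime, matching the selection criterion stated above.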
Conclusion
The experimental results described in this article indicate that the interaction between learning and behavior within an evolutionary context gives rise to a number of synergistic phenomena:
a) Behavior affects learning by selecting a subset of learning experiences that are functional to the survival task;
b) Learning affects behavior by generating selection pressure for actions that actively search for situations that are learned;
c) Learning contributes to the adaptive power of evolution (as long as the parameters subject to learning are not also genetically encoded) by coping with change that occurs faster than evolutionary time, as is the case of transfer from simulation to reality.
These results are promising for scalability and potential applications of evolutionary robotics in real-world situations where robots cannot possibly be evolved in real-time in the real world, but may have to evolve at least partly in simulations. They are also an indication that complex behavior can be generated by relatively simple control architectures with active behavior and local unsupervised learning that can be implemented in low-power, low-cost micro-controllers. The significant role of behavior in receptive field formation during learning is being increasingly recognized in the neuroscience community [12].