17-11-2012, 04:39 PM
The Structure of Intelligence: A New Mathematical Model of Mind
A Structure of Intelligence.pdf (Size: 473.04 KB / Downloads: 23)
Introduction
Psychology versus Complex Systems Science
Over the last century, psychology has become much less of an art and much more of a science.
Philosophical speculation is out; data collection is in. In many ways this has been a very positive
trend. Cognitive science (Mandler, 1985) has given us scientific analyses of a variety of
intelligent behaviors: short-term memory, language processing, vision processing, etc. And
thanks to molecular psychology (Franklin, 1985), we now have a rudimentary understanding of
the chemical processes underlying personality and mental illness. However, there is a growing
feeling -- particularly among non-psychologists (see e.g. Sommerhoff, 1990) -- that, with the
new emphasis on data collection, something important has been lost. Very little attention is paid
to the question of how it all fits together. The early psychologists, and the classical philosophers
of mind, were concerned with the general nature of mentality as much as with the mechanisms
underlying specific phenomena. But the new, scientific psychology has made disappointingly
little progress toward the resolution of these more general questions.
One way to deal with this complaint is to dismiss the questions themselves. After all, one might
argue, a scientific psychology cannot be expected to deal with fuzzy philosophical questions that
probably have little empirical significance. It is interesting, though, that behaviorists and
cognitive scientists tend to be in agreement regarding the question of the overall structure of the mind.
Mind and Computation
The analysis of mind to be given in the following chapters is expressed in computational
language. It is therefore implicitly assumed that the mind can be understood, to within a high
degree of accuracy, as a system of interacting algorithms or automata. However, the concept of
"algorithm" need not be interpreted in a narrow sense. Penrose (1989), following Deutsch
(1985), has argued on strictly physical grounds that the standard digital computer is probably not
an adequate model for the brain. Deutsch (1985) has proposed the "quantum computer" as an
alternative, and he has proved that -- according to the known principles of quantum physics -- the
quantum computer is capable of simulating any finite physical system to within finite accuracy.
He has proved that while a quantum computer can do everything an ordinary computer can, it
cannot compute any functions besides those which an ordinary computer can compute (however,
quantum computers do have certain unique properties, such as the ability to generate "truly
random" numbers). Because of Deutsch's theorems, the assertion that brain function is
computation is not a psychological hypothesis but a physical, mathematical fact. It follows that
mind, insofar as it reduces to brain, is computational.
QUANTUM COMPUTATION
If the universe were fundamentally deterministic, the theory of stochastic computation would
be superfluous, because there could never really be a stochastic computer, and any apparent
randomness we perceived would be a consequence of deterministic dynamics. But it seems that
the universe is not in fact deterministic. Quantum physics tells us that chance plays a major role
in the evolution of the physical world. This leads us to the question: what kind of computer can
simulate any physical system? What kind of computer can follow any precise set of physical
instructions?
It turns out that neither a Turing machine nor a stochastic Turing machine has this property.
This puts the theory of computation in a very uncomfortable situation. After all, the human brain
is a physical system, and if computers cannot simulate any physical system, there is no reason to
simply assume that they can simulate the human brain. Perhaps they can, but there is no reason
to believe it. Clearly it would be desirable to design a computer which could simulate an
arbitrary physical system. Then we would have a much better claim to be talking about
computation in general.
Computational Complexity
Computational complexity theory, also called algorithmic complexity theory, seeks to answer
two different kinds of questions: "How hard is this problem?", and "How effective is this
algorithm at solving this problem?". A number of difficult issues are involved here, and it is not
possible to delve into them deeply without sophisticated mathematics. Here we shall only scratch
the surface.
Questions of computational complexity are only meaningful in the context of a general theory
of computation. Otherwise one can only ask "How hard is this problem for this computer?", or
"How hard is this problem for this particular person?". What lets us ask "How hard is this
problem?", without any reference to who is actually solving the problem, is a theory which tells
us that problems are basically just as hard for one computer as for another. Here as in so many
other cases, it is theory which tells us what questions to ask.
According to the theory of Turing machines, any sufficiently powerful computer can simulate
any other computer. And this is not merely a theoretical illusion. In practice, computers such as
PCs, mainframes and supercomputers are highly flexible. An IBM PC could be programmed to
act just like a Macintosh; in fact, there are software packages which do something very close to
this. Similarly, a Macintosh could be programmed to act just like an IBM PC. Turing proved that
there is a program which tells a computer, given appropriate information, how to simulate any
other computer. Therefore, any computer which is powerful enough to run this program can act
as a universal Turing machine. If it is equipped with enough memory capacity -- e.g. enough disk
drives -- it can impersonate any computer whatsoever.
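Turing's universal-simulation idea can be sketched concretely. The routine below (my own minimal illustration, not anything from the book) runs any Turing machine supplied as a transition table, so one fixed program "impersonates" arbitrarily many different machines; the example machine and its names (`run_turing_machine`, `flipper`) are hypothetical.

```python
# A single routine that simulates ANY one-tape Turing machine given as a
# transition table -- a toy version of Turing's universal machine.

def run_turing_machine(transitions, tape, state="start", blank="_", max_steps=10_000):
    """Simulate a one-tape Turing machine.

    transitions: dict mapping (state, symbol) -> (new_state, write_symbol, move),
                 where move is -1 (left), +1 (right), or 0 (stay).
    Returns the final tape contents as a string, blanks trimmed.
    """
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Example machine: walk right, flipping 0 <-> 1, and halt at the first blank.
flipper = {
    ("start", "0"): ("start", "1", +1),
    ("start", "1"): ("start", "0", +1),
    ("start", "_"): ("halt", "_", 0),
}

print(run_turing_machine(flipper, "10110"))  # -> 01001
```

The simulator itself knows nothing about bit-flipping; all machine-specific behavior lives in the table it is handed, which is exactly the sense in which one sufficiently powerful computer can act as any other.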
AVERAGE-CASE ANALYSIS
Note that the quantity A(n) is defined in terms of "worst-case" computation. It is the longest that
computer A takes to solve any problem instance of size n. Any computer worth its salt can sort
the list {1,2,3,4,5,6,7,8,9,10} faster than the list {5,7,6,4,10,3,8,9,2,1}. But A(n) ignores the easy
cases. Out of all the possible instances, it only asks: how hard is the hardest?
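The gap between the easy and hard cases of the same size n can be made concrete. The sketch below (my own illustration; the function name and step-counting convention are assumptions, not the book's) counts the comparisons insertion sort makes on an already-sorted list versus a reversed one:

```python
# Worst-case vs. easy-case cost for the same problem size n, using
# insertion sort and counting comparisons as the measure of work.

def insertion_sort_steps(lst):
    """Sort a copy of lst with insertion sort; return the comparison count."""
    a = list(lst)
    steps = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            steps += 1                          # one comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]  # out of order: swap, keep going
                j -= 1
            else:
                break                            # in order: this element is placed
    return steps

easy = insertion_sort_steps([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])   # already sorted
hard = insertion_sort_steps([10, 9, 8, 7, 6, 5, 4, 3, 2, 1])   # reversed
print(easy, hard)  # -> 9 45
```

Both instances have size n = 10, yet one costs 9 comparisons and the other 45; a worst-case measure like A(n) reports only the 45.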
For some applications, this is a useful way to look at computation. But not always. To see
why, consider the following well-known problem. A salesman, driving a jeep, must visit a
number of cities in the desert. There are no mountains, rivers or other obstructions in the region.
He wants to know what is the shortest route that goes through all the different cities. This is
known as the Traveling Salesman Problem. Each specific instance of the problem is a particular
collection of cities or, mathematically speaking, a set of points in the plane. The size of an
instance of the problem, n, is simply the number of cities involved.
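An instance of this problem, and the most naive way of solving it, can be sketched in a few lines. This is my own brute-force illustration (the function name and the four-city instance are assumptions): it tries every ordering of the cities, which is exact but takes (n-1)! tours, exploding as n grows.

```python
# Brute-force Traveling Salesman: an instance is a set of points in the
# plane; we try every tour order and keep the shortest closed route.

from itertools import permutations
from math import dist

def shortest_tour(cities):
    """Return (length, tour) of the shortest closed route through all cities."""
    start, *rest = cities
    best_len, best_tour = float("inf"), None
    for order in permutations(rest):        # fix the start city; permute the rest
        tour = [start, *order, start]       # close the loop back to the start
        length = sum(dist(a, b) for a, b in zip(tour, tour[1:]))
        if length < best_len:
            best_len, best_tour = length, tour
    return best_len, best_tour

cities = [(0, 0), (0, 1), (1, 1), (1, 0)]   # four cities on a unit square
length, tour = shortest_tour(cities)
print(round(length, 6))  # -> 4.0 (the square's perimeter)
```

For these four cities the best tour is simply the square's perimeter, but at n = 20 this approach would already need to examine 19! tours, which is why the instance size n is the natural measure of difficulty here.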