In article <7i4285$sge$1 at news5.svr.pol.co.uk> Simon Marsh,
simonmarsh at tardis99.freeserve.co.uk writes:
>Some neuroscientists estimate that during an average lifespan, a person
>uses only 1/100 of 1% (.0001) of his potential brain capacity. - Why do
>we have a brain with so much capacity that we hardly test a fraction of
>it in a normal lifetime?
>
OK, OK. We've all had our fun. Now, here's a serious answer, Simon:

The reason you've gotten so many snide responses to your question is
that most of us "neuroscientists" get asked this same question pretty
often, and we're all frankly bewildered by it, for several reasons.
First, as neuroscientists, we're all at least marginally concerned about
issues like "what is the information capacity of the brain?". The
problem is, working in this field for a while has taught us that a) we
don't know what this capacity is; b) we haven't figured out how to
measure it; and c) we haven't even figured out what exactly we mean by
"information" or "capacity" when talking about the brain.

This isn't because we're all stupid. It's because there isn't any simple
or unique way of phrasing these questions or making these measurements.
For example, to measure how much of the "capacity" is actually being
used, one could use the ratio of a single neuron's firing rate to its
maximum possible rate (let's think real simple for the moment, and leave
the rest of the brain out of the equation). Under this definition our
neurons would be using hardly any of their capacity, because they usually
don't fire anywhere near as fast as they could. But is this a sensible
definition? By this measure, the only times our neurons would be using
their full capacity would be during epileptic seizures (hardly what one
thinks of as an efficient use of brain power).
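
Just to make that ratio definition concrete, here's a toy sketch in
Python. The numbers are hypothetical, chosen only for realistic scale
(ongoing cortical rates are often a few spikes per second, while the
refractory period caps rates at a few hundred):

    # Toy sketch: "capacity used" as mean firing rate / max firing rate.
    # The numbers are made up, picked only for plausible orders of magnitude.
    mean_rate_hz = 5.0     # typical ongoing firing rate, spikes/s
    max_rate_hz = 500.0    # ceiling set by the refractory period

    fraction_used = mean_rate_hz / max_rate_hz
    print(f"capacity used: {fraction_used:.1%}")   # -> capacity used: 1.0%

Even with generous numbers the ratio comes out tiny, which is exactly
why it's a poor definition: it is maximized by a seizure.
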
On the other hand, we could try to express "capacity" as Ma Bell does,
and define it in terms of efficient coding of information. Doing this,
we'd learn that you can carry the maximum information (in bits/sec) when
you are working at half or less than half of your maximum rate.
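
Here's the simplest binary-channel version of that claim, as a sketch
(Python; it assumes spikes are independent all-or-none events in
discrete time bins, which is a cartoon of a real neuron). The bits per
bin, H(p) = -p*log2(p) - (1-p)*log2(1-p), peak when the firing
probability p is 1/2, i.e. at half the maximum rate:

    # Sketch: bits per time bin for a binary spike/no-spike code.
    # Shannon entropy H(p) peaks at p = 0.5, i.e. half the maximum rate.
    from math import log2

    def bits_per_bin(p):
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    for p in (0.1, 0.25, 0.5, 0.75, 0.9):
        print(f"firing probability {p:.2f}: {bits_per_bin(p):.3f} bits/bin")
    # Output peaks at p = 0.5, with 1.000 bits/bin.

(Once you charge a metabolic cost per spike, the optimum shifts below
one half, which is presumably the "or less than half" part.)
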
But this just raises the question of what we mean by "information".
There are probably a hundred different ways of quantitatively defining
this term: Shannon Information, Fisher Information, Kolmogorov
Complexity, Algorithmic Information, etc, etc, etc... and none of these
is the same as what we usually mean in everyday language when we say
"information". We usually mean, well, er... "meaning"! In the semantic
sense. Whereas most of the quantitative definitions explicitly don't
mean that. They are aiming instead at a formal mathematical description
of an arbitrary measurable quantity. So this business of "brain
capacity" is a completely wide-open gap in our knowledge. We haven't
even figured out exactly how to ask the question, let alone answer it.
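
If the gap between those formal measures and everyday "meaning" sounds
abstract, here's a toy contrast (Python; the strings are arbitrary
made-up examples, and compressed length is only a crude practical
stand-in for Kolmogorov complexity, which is uncomputable in general):

    # Sketch: a formal "information" measure that ignores meaning.
    # zlib's compressed length crudely upper-bounds Kolmogorov complexity.
    import random, zlib

    random.seed(0)
    regular = b"spike" * 200                                   # 1000 bytes, very regular
    scrambled = bytes(random.randrange(256) for _ in range(1000))  # 1000 pseudorandom bytes

    for name, s in [("regular", regular), ("scrambled", scrambled)]:
        print(name, len(s), "bytes ->", len(zlib.compress(s)), "bytes compressed")
    # The regular string compresses to a few dozen bytes; the scrambled
    # one barely compresses at all. By this measure the meaningless
    # noise carries *more* information -- nothing like semantic meaning.
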
It's therefore hardly surprising that we get a little testy when we see
Deepak Chopra or whoever bandying about statements like "neuroscientists
estimate that during an average lifespan, a person uses only 1/100 of 1%
(.0001) of his potential brain capacity". Most of us would be only too
thrilled to learn that someone had actually figured out a way to make
such a statement accurately and meaningfully. But so far, it's just
hooey. And anyone who says otherwise is probably selling something.
Cheers,
Matt Jones