2010 Oct 29th
A reader asked me what I thought about the limits of defining large numbers.
Such discussions begin with specific arithmetic operations and mathematical symbols in mind, and usually focus on comparing one system (such as Conway’s “chained arrow notation”) to another (such as “Bowers’ extended operators”). The choice of symbols and operations affects how high one can go, and such discussions usually devolve into competitive games, the limits of which are fairly well handled by the Turing machine and the Lin/Rado “busy beaver function”.
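To make the comparison concrete, here is a minimal sketch of Conway’s chained arrow notation in Python, implementing the standard reduction rules directly. The function name `chain` and the list representation of a chain are my own choices for illustration; only the very smallest inputs finish in practice, since the values explode almost immediately.

```python
def chain(c):
    """Evaluate a Conway chained-arrow expression given as a list,
    e.g. [2, 3, 2] for 2 -> 3 -> 2."""
    if len(c) == 1:
        return c[0]                 # a lone number is its own value
    if len(c) == 2:
        return c[0] ** c[1]         # a -> b  =  a^b
    *x, p, q = c
    if q == 1:
        return chain(x + [p])       # X -> p -> 1  =  X -> p
    if p == 1:
        return chain(x)             # X -> 1 -> q  =  X
    # X -> p -> q  =  X -> (X -> (p-1) -> q) -> (q-1)
    return chain(x + [chain(x + [p - 1, q]), q - 1])

# 2 -> 2 -> 2 = 2^^2 = 4;  2 -> 3 -> 2 = 2^^3 = 16
```

A three-element chain a → b → n equals a ↑ⁿ b in Knuth’s up-arrow notation, so even [3, 3, 3] (3↑↑↑3) is already far beyond anything this code could compute, which is exactly the point such competitive notations are making.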
But such discussions usually come out of a more universal question, which regards the limits of human thought and perception in general.
Limits of human thought and perception are apparent throughout the history of numbers and mathematics. After a survey of early human developments (such as the one presented in the nearly exhaustive “Universal History of Numbers” by Georges Ifrah, ISBN 0-471-37568-3), one might notice some patterns:
- Perception and understanding are limited by the symbols in use and the concepts they represent,
- Mastery of a given set of concepts leads to invention of new symbols and concepts.
At any point in history, or within any specific culture, there is a specific set of ideas and symbols which creates (or perhaps reflects, or both?) a natural limit on the capacity of the mind to perceive (say) large finite numbers.
It has been the trend throughout our history that the intellectual developments of earlier generations become assimilated into the body of common knowledge and added to the standard educational curriculum. As new material is added, earlier material is often compressed and taught (usually with greater efficiency) in a shorter period. So it is that the most advanced arithmetic of the early Babylonians is surpassed by that learned by today’s 8- and 9-year-old students, most of the algebra techniques of 9th-century Arabia are (typically) learned by 13- or 14-year-olds today, and so on. Both are aided by more recent developments (Indo-Arabic numerals aid arithmetic; certain new teaching methods address the abstraction of variables in algebra; etc.).
Speculating about the limits of the human mind (or brain, for reductionists) can lead to discussions that test or challenge religious beliefs. I suppose the majority opinion in most cultures would state that the human mind has some kind of ultimate limit, which can be compared to the limited physical size of the human brain. (Such a conclusion helps believers maintain the distinction between humanity and God, avoiding blasphemy.)
A universe, assuming it is also limited in size (or a visible universe as limited by an event horizon or light cone), would therefore also have a finite limit.
The development of our culture over thousands of years is a bit like an expanding light cone. The contraction of the curriculum into ever-shorter stretches of childhood is like the Lorentz contraction of galaxies known to be much further away, and therefore seen in a remote past, when the universe and the visible universe (our view of the world and the sum total of knowledge) were both much smaller.