# Addition and the Origin of the Human Mind


How did language and arithmetic evolve? [1] It is natural to ask about both in the same breath because of certain broad similarities between the two, particularly regarding discrete infinity, recursive rules, and computation. It would be nice if a common feature could be revealed allowing both to have the same origin. This would also provide an identical explanation for the learnability of arithmetic and language: the same basic cognitive mechanism is responsible for acquiring both sorts of competence, suitably specialized. The idea is that a single mutation, occurring around 200,000 years ago, provided the human brain with the cognitive machinery to grasp both the syntactic structure of language and the structure of arithmetic. No doubt this basic machinery got supplemented and shaped by the demands of externalization and other factors, but the core principle evolved in a single genetic mutation encoding an instruction for the construction of human brains. A new brain circuit implementing a cognitive trick or trait sufficed to permit the arrival of arithmetic and language. Thus the specifically human mind evolved as an upshot of this remarkable mutation; and the rest is history. The question is what this magical mutation might be. It needs to be both simple enough to evolve in the standard manner and yet rich enough to encompass the essence of the competences it permits. This is no doubt a daunting question, but presumably it has an answer—and we might as well set about trying to answer it. So: what structural, operational principle lies behind both arithmetic and language?

The answer I will propose is: *addition*. We should first rid our minds of the usual connotations of that word, namely school sums written with the plus sign. The *OED* gives this for “add”: “join to or put with something else”. Notice this does not even mention numbers specifically; it is a very general operation of joining or combining different things. The mathematical sense of “add”, as we now understand it, is given by the *OED* as “put together (two or more numbers or amounts) to calculate their total value”. Roughly, then, addition is an operation of joining different things to form a whole—as in joining 3 and 5 to get 8. What is the analogue in the case of language? Conjunction, of course—in the narrow logician’s sense and in the wider grammatical sense. In the logician’s sense the word “and” works to conjoin two sentences to deliver a certain truth table: one sentence is *added* to another to produce a larger sentence true if and only if both conjoined sentences are true. The truth-value of the whole may be said to incorporate the truth-values of the conjoined sentences, rather as 8 incorporates 5 and 3. In the grammarian’s sense conjunction is not limited to “and” and its synonyms: the *OED* gives “a word used to connect clauses or sentences or to coordinate words in the same clause (e.g. *and*, *if*)”. So disjunction is a type of conjunction: it is a way to add sentences to other sentences. In fact, the concatenation operation is itself just another type of addition: in a sentence or phrase words are joined together by an operation of addition (“concatenate”: “link together in a chain or series”, *OED*). This operation has infinite potential. It is clearly part of our linguistic competence, even though it may be unconscious and automatic. But the same is true of arithmetical addition: our mathematical competence is likewise predicated on a grasp of numerical addition, which may also be unconscious and automatic.
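The parallel can be put concretely in a few lines of code. This is a purely illustrative sketch (the function names are mine, not drawn from the essay): the same abstract operation of joining appears as numerical addition, as logical conjunction, and as concatenation of words.

```python
# Illustrative sketch: three guises of one joining operation.
# All names here are hypothetical, chosen only for exposition.

def add_numbers(a, b):
    """Numerical addition: join 5 and 3 to get the new whole 8."""
    return a + b

def conjoin(p, q):
    """Logical conjunction: the whole is true iff both conjuncts are true."""
    return p and q

def concatenate(words, more_words):
    """Concatenation: link words together in a chain or series."""
    return words + more_words
```

Each function takes two elements and returns a single new whole that incorporates them, rather as 8 incorporates 5 and 3.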
So there is a factor in common here: a principle of addition that takes us from one set of elements to another—a joining together of parts into wholes. The hypothesis, then, is that mastery of this operation lies behind the origins of our human mastery of language and arithmetic. In short, there was a mutation for addition (the cognitive competence) and this is what allowed arithmetic and language to get off the ground. It was like the development of an aerodynamic wing (in both biological evolution and aircraft technology).

There cannot be much doubt that addition is fundamental to arithmetic. As the mathematics textbooks say, subtraction is just the inverse operation to addition (it “undoes” addition), and any subtraction formula can be rewritten as an addition formula. Multiplication and division calculations likewise involve addition. When someone grasps the concept of addition he or she grasps the concept of subtraction: what can be added can also be taken away. If you can add 3 to 5 to get 8, you can also subtract 3 from 8 to get 5: the two concepts are intertwined. Also, each number can be viewed as the continued addition of 1 to 1, or some other type of addition of integers. Isn’t arithmetic really the systematic study of addition? The natural number series is just one long addition; the successor function simply adds 1 to the preceding number taken as argument. There is no need to labor the point: addition is the lifeblood of arithmetic. In the case of language, we are not adding numbers, but we are adding another type of unit—what we call a word, a unit of meaning. This is not just a matter of uttering words in temporal sequence; it is a more abstract mental operation, often carried out entirely inside the mind. It is a compositional process analogous to numerical addition (which may involve adding amounts of stuff not merely numbers). The suggestion, then, is that this additive compositional process might be the foundation on which mature language and arithmetic are based. In order to evaluate this proposal I will now list the defining features of addition in the intended sense; it will emerge that addition has specific formal features that suit it to performing such a role. It is a more refined and structured operation than might at first appear: it is both rich and yet primordial—exactly what we need to solve the problem of origins for language and arithmetic.
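The claim that the number series is “one long addition” can be rendered in a short sketch (illustrative only; the names are my own): the successor function adds 1 to its argument, any natural number can be reached by iterating it, and subtraction simply runs the operation in reverse.

```python
def succ(n):
    """Successor: add 1 to the preceding number taken as argument."""
    return n + 1

def number_by_iterated_addition(k):
    """Reach the natural number k by repeatedly adding 1, starting from 1."""
    n = 1
    for _ in range(k - 1):
        n = succ(n)
    return n

# Subtraction as the inverse that "undoes" addition:
# if 5 + 3 = 8, then 8 - 3 = 5.
difference = (5 + 3) - 3
```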

First, addition is infinitely productive: you can keep on doing it *ad infinitum*. You can keep on adding numbers to numbers to get further numbers, and you can keep on adding words to words to get more words. The word “and” by itself has infinite productivity: you can conjoin sentences and predicates to infinity, but you can also conjoin singular terms, as in “gin and tonic” or “strawberries and cream”. The concatenation function likewise has infinite range, as does our grasp of it (logically it is just like the function expressed by “plus” in arithmetic). In both cases addition operates over discrete entities, thus generating discrete infinities (as opposed to continuous magnitudes). It is no small matter to acquire a capacity to handle such an infinitely productive operation. Second, and connectedly, addition is generative: it generates one thing *from* another. It isn’t passive or static but active and dynamic. Thus we have generative grammar and generative arithmetic—rules that produce something from something else. Third, addition is combinatorial in the sense that it brings things together to produce something new: it isn’t just a brute process of sequencing but the production of a new entity considered as a whole. Adding 5 and 3 produces the number 8, which is not just a sequence (ordered pair) consisting of 5 and 3. Likewise a sentence is a new whole derived by combining parts; it is not just a *list* of words but a new type of linguistic unit. So the agent of such construction must be able to grasp the whole that results from the operation of combination *as* a whole. It isn’t just setting elements side by side but *combining* them. Fourth, and consequently, addition is *ampliative* (Kant’s word) in the sense that it generates something not already present in what is added together; it produces not just arbitrary strings but organic unities (to use the old-fashioned term).
New phrases and sentences are unities in their own right, just as numbers are: addition has the power to confer such unity on its outputs. Fifth, this ability is reflected in the creativity of addition: that is, mathematical and linguistic competences consist in a capacity to create brand new wholes, new unities. Even a simple conjunction (“grass is green and the sky is blue”) exhibits this kind of creativity—rather like the production of a number no one has ever thought of before. Addition is not merely “mechanical”: it involves breaking new ground, going where no man has gone before. It might even intersect with creation in the usual sense of exceptional human production—as in writing poetry or discovering a new type of number. Without the ability to “put things together” mentally human creativity in the usual sense would not be possible. It is actually quite a feat to add 5 and 3, and likewise a feat to produce even a simple sentence like “the sky is blue” (adding one word to another till we get the desired result). Sixth, and importantly, addition is hierarchical: you can add what has previously been added. You can add 3 to 5 and then add the resulting number to another number. The addition operation can be applied cyclically and recursively: this would include adding to the result of a subtraction in order to get a further number. Bracketing becomes necessary for depicting such computations. In the same way language allows for hierarchical structure: we can, for example, conjoin conjunctions (as well as disjunctions etc.). In this respect addition is like Chomsky’s *Merge* operation, which also applies to its own outputs in a hierarchical manner. [2] Indeed *Merge* may be seen to incorporate *Add*, since it involves joining or combining elements to produce a new element: merging *X* and *Y* into *Z* is adding *X* and *Y* to get *Z*. In both operations we have the ability to apply the operation to its own outputs generated at a lower level.
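The hierarchical point can be pictured in code. What follows is only a toy model of a Merge-like operation (representing a merged pair as a tuple is my assumption for exposition, not Chomsky’s formalism): the operation applies to its own outputs, building wholes out of wholes.

```python
def merge(x, y):
    """Join two elements into a new whole (toy model: pairs as tuples)."""
    return (x, y)

# Hierarchical application: merge the results of earlier merges.
np = merge("the", "sky")
vp = merge("is", "blue")
sentence = merge(np, vp)   # a whole built out of wholes

# Arithmetic analogue: add, then add the result again;
# brackets record the order of application.
total = (5 + 3) + 2
```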
Seventh, addition has scope in the logical sense: there is always a question as to what the scope of the addition operation is supposed to be. It is like the scope of quantifiers: not every variable to the right of a quantifier is bound by it, just as not every number following a given one is automatically included within the addition operation. We have conventions, generally expressed by brackets, for indicating scope, and addition needs such conventions in order to avoid ambiguities. Addition is thus selective in its intended scope, not all-inclusive. Eighth, and worth emphasizing, addition is notably liberal in its domain of operation: you can add quite disparate things to each other; similarity is not required. Any number can be added to any number (not just even numbers to even numbers, say), just as any sentences can be conjoined regardless of subject matter. This enables us to transcend natural associations between things: things don’t have to be conjoined in nature to be conjoined in thought. That is how set formation works: a set may contain the Eiffel tower *and* your favorite aunt *and* that dog over there. There is a certain *freedom* to the addition operation: it is not too choosy about what it will combine. This gives it enormous creative power; it liberates thought from the tyranny of nature. A mind possessing it thereby possesses considerable freedom of expression. We should not underestimate the power to put things together *ad libitum* (as well as *ad infinitum*). Finally, addition has an inverse: what can be added can be taken away. Addition is a *reversible* operation. You can add 3 to 5 but you can also subtract it from 8; you can conjoin two sentences but you can also de-conjoin them (as in conjunction elimination in logic). This gives addition flexibility—it isn’t stuck with the wholes it has produced. 
You can form ever more complex sentences, but you can also simplify sentences by removing parts of them; indeed, this is just the other side of what addition is. Adding and subtracting are parts of the same package.
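Scope and reversibility can both be seen in miniature (an illustrative sketch of my own, not part of the essay’s argument): brackets fix what falls within an addition, and what has been joined can be taken apart again.

```python
# Scope: brackets determine what the addition operation includes.
narrow = 2 + 3 * 4     # the addition joins only 2 and 12
wide = (2 + 3) * 4     # the brackets widen the addition's scope

# Reversibility: a conjoined whole can be de-conjoined
# (the analogue of conjunction elimination in logic).
conjunction = ("grass is green", "the sky is blue")
left_conjunct, right_conjunct = conjunction
```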

Putting all these properties together, we can see that addition is by no means a simple matter of setting things side by side like marbles in a drawer. It has subtlety and structure, a rich cognitive profile. Yet it is conceivable that it arose by a relatively localized mutation, producing a distinctive piece of neural rewiring. It might have arisen much like the cornea or the eyelash. But it also has the power to carry us a long way in the manufacture (the engineering) of arithmetic and language (i.e. syntax). A great deal of these two cognitive faculties can be fitted into the framework of addition—that mental operation has considerable power to produce what is characteristic of arithmetic and language. Perhaps not all—other factors no doubt joined with this basic factor—but as a fundamental cognitive principle it is capable of a lot of work. Certainly other animals are lacking in its productive power: they may have primitive communication systems and an elementary grasp of counting, but they don’t have the full structure generated by addition in its abstract form. They have not yet reached the stage of unlimited mental conjunction. What the human mind is particularly good at is forming new wholes by means of addition (composition, conjunction, putting together)—where this is to be understood by reference to the ensemble of features enumerated above. Once this operation evolved in the human brain it was available for use in various worthwhile endeavors such as calculating, thinking, and talking. It arose by chance but it was soon exploited and promoted by natural selection. Thus we became the adding species, the dedicated conjoiners, the arithmeticians and grammarians. Now we add up all the time, constantly using our capacity to put things together, always creating sums.
This theory seems like the optimal combination of simplicity and fecundity necessary in any proposal for explaining the evolution and learning of arithmetic and language, with the bonus that both areas fall under the same theoretical framework. Both are offshoots of a primordial ability, arriving a couple of hundred thousand years ago, to perform acts of addition. Arithmetic is the application of addition to numbers, and language (syntax) is the application of addition to words. [3]

[1] This paper was stimulated by things Chomsky has said in various places. It presupposes a lot and is very compressed.

[2] See Robert C. Berwick and Noam Chomsky, *Why Only Us: Language and Evolution* (MIT Press, 2016), especially pp. 72–74. My suggestion might be viewed as complementary to this but with a different emphasis.

[3] We should note that the theory is not intended to explain the origin of words (lexical elements) or concepts of numbers. It doesn’t even explain the existence of standard grammatical categories. It is solely concerned to explain the most abstract features of language and arithmetic (as the *Merge* operation is supposed to). It tells us how certain structural properties of the two domains may have come into existence. It is basically a theory of the origins of the human combinatorial capacity. Much would need to be added to it to reach arithmetic and language as they now exist in the human species. Still, we do need a theory of the basic cognitive architecture of these two domains that is consistent with their having evolved in the usual way. We need a theory of the basic form of the innate program for acquiring these capacities.

This is just an addition to your post. The curious question as always is this: What are the foremost distinguishing features of the exercise of human intelligence? One longs for a compelling explanation of the flexibility and creativity of human thought, language and behaviour. But absent a science of the scope and limits of the plasticity of our conceptual competence, iteration, recursion (and addition, as you conceive it) are but combinatorial properties of a cognitive competence in search of meaning. You’ve said as much. A finite lexicon with a finite number of rules of combination is an explanatory dead-end.

On the contrary, the combinatorial powers of language are not at all trivial. Creativity in the sense of free action and artistic creativity bring in other factors (mostly unknown).