The Alphabet of Thought
An alphabet consists of a relatively small number of letters correlated with simple sounds. The modern English alphabet (deriving from the Latin alphabet) has 26 letters. The sounds represented are those found in speech, so the alphabet is a way to code the sound structure of speech. Writing consists of strings of letters that correspond to the sounds of spoken language. Any sentence of a language can be represented by an alphabet—infinitely many such sentences. Words can be represented as sequences of letters in combination. We can also say that spoken language has an alphabet—the collection of basic sounds that make up vocal utterances. These too are relatively few in number (around 30), as dictated by the human articulatory system. Clearly combination plays a large role in allowing these primitive elements to generate so many meaningful strings. A notable, indeed defining, feature of an alphabet is that the basic elements are not themselves meaningful: the sounds and marks that compose an alphabet have no meaning in isolation (with very few exceptions). For example, “red” consists of three letters and three sounds (phonemes), corresponding to “r”, “e”, and “d”, none of which have any meaning independent of their meaning in combination. We might call this “the principle of non-semantic composition”, or “the principle of alphabetical composition”. Phrases and sentences obey a principle of semantic composition because words are meaningful units, but words (spoken or written) are made of elements that are not meaningful in their own right. Meanings are not being combined when words are formed from marks or sounds. This is what enables an alphabet to do its work: if the principle of combination were purely semantic, we would soon see an explosion of primitive elements—as many as there are basic meanings. Such an alphabet would be unwieldy to use and a strain on the memory. So we make do with 30 or so basic elements and let combination do the job of generating the infinitely many sentences that language contains. Speech and writing are thus economical as to primitives but fertile as to combinations—a great many meanings and a handful of letters or sounds.
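To make the principle of alphabetical composition concrete, here is a small programmatic sketch (a toy model only; the ten-letter alphabet and the three-entry lexicon are invented for illustration). The primitives mean nothing on their own; meaning attaches only to certain combinations; and even this tiny alphabet generates a huge space of candidate strings.

    # Toy model of "the principle of non-semantic composition":
    # the primitive elements are meaningless; only certain combinations mean anything.

    ALPHABET = list("abdegilnor")      # ten meaningless primitive marks
    LEXICON = {                        # meaning attaches only to combinations
        "red": "the concept RED",
        "dog": "the concept DOG",
        "ride": "the concept RIDE",
    }

    def meaning(string):
        """Return the meaning of a string if it has one; primitives never do."""
        return LEXICON.get(string)

    for item in ["r", "e", "d", "red"]:
        print(f"{item!r:>6} -> {meaning(item)}")
    # 'r', 'e', 'd' -> None (no meaning in isolation); 'red' -> a meaning

    # Economy: strings of length 1 to 5 over just ten primitives
    print(sum(len(ALPHABET) ** n for n in range(1, 6)))   # 111110 candidate strings

The design mirrors the point of the paragraph above: semantics lives in the lexicon of combinations, not in the alphabet itself.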
The question I am interested in is whether thought works alphabetically: does it too have an alphabet in the sense just outlined? Let’s assume there is a language of thought (LOT): then the question is whether this language is composed of alphabetic elements in the way speech and writing are. We already know it is composed of words—so much is implicit in calling it a language—but it is a further question whether the words themselves have an alphabet-like composition. Are the internal words made up of mental letters or mental sounds? Are they spelled a certain way? I don’t mean letters of the current English alphabet (though this cannot be ruled out on logical grounds) or actual sonic events (that would make thought noisy); I mean: does thought contain anything comparable to these alphabets? That is, are the words of LOT composed non-semantically? Is there an alphabet for the language of thought? Given that thoughts are made up of concepts, this is the question of whether concepts are expressed in internal versions of an alphabet. Does the concept red, say, carry with it components corresponding to “r”, “e”, and “d”? Does the word of LOT that represents that concept have components that are themselves not conceptually significant? The word of LOT that represents the concept red is a meaningful word, but is this word composed of meaningless elements in the manner of an alphabet? Is thought alphabetically represented in the mind? When you think, are you somehow uttering “sounds” or making “marks” that have no individual meaning? Does LOT work with about 30 such elements, using them to construct infinitely many mental representations?
We have no direct evidence that this is so. There aren’t any letters written on the brain that we can decipher, and no audible susurrations issuing from the cortex; nor has anyone compiled a list of alphabetic thought elements. Neither can we introspect the alphabetic components of LOT, as we can hear and see the sounds and letters that make up our ordinary alphabet. But presumably that is not to be expected, given that LOT is an unconscious mental reality; and we don’t generally insist that every decent psychological construct be directly detectable. Maybe we could find indirect evidence for such an alphabet, even identifying its constituents; or maybe not, even if it is a psychological fact. Still, we can inquire into whether there are any plausibility considerations that might favor the idea. First, we have the precedent supplied by spoken language: it is accompanied by an alphabetic system, so why not the internal language? This might conceivably be true of outer language even though we couldn’t directly detect the sounds and marks that constitute the alphabet, perhaps because they are hidden away somehow and only manifest themselves in acts of communication (going straight into the brain without any sensory representation).[1] Second, we know that concepts have alphabetic vehicles, since they are expressed by spoken and written words, so it is perfectly possible for them to have such internal vehicles too. Third, not having an alphabetic structure is massively inconvenient—it would require the brain to have a distinct primitive symbol for every basic concept. Why not take advantage of the combinatorial powers supplied by an alphabet? This would give us computational economy and efficiency—the brain only has to manipulate about 30 basic representational units. Fourth, inner speech presumably mirrors outer speech by deploying a parallel internal alphabet: when we say “red” to ourselves silently, we mentally rehearse bits corresponding to “r”, “e”, and “d” (you can perform this experiment on yourself now). If so, granted the close connection between inner speech and thought, it is a small step to recognize the same structure in the realm of what we call pure thought. Maybe what happens here is just that the alphabet goes underground while still chugging away imperceptibly. Fifth, the idea of a language without an alphabet is not easy to make realistic sense of: the speaker needs a manageable system of elements with which to form words; otherwise words become difficult to produce and understand, or few in number. Speech and writing work so well because they operate from a few primitives with a small number of combinatorial operations: thus they are enormously repetitive, which is helpful. LOT should avail itself of the same convenience.
But what are these elements, you might wonder? What are the non-semantic units that combine to form semantic units? They have to exist in the brain and be capable of impressive feats of rapid dexterity. Electrical patterns! We know how closely electrical activity in the brain maps onto mental activity, so why not suppose that words of LOT are composed of electrical patterns that combine to generate a meaningful word? There have to be such electrical patterns when a word of LOT is tokened, so why not accept that there are constituent patterns corresponding to the alphabetic structure of the word? In other words, the “r” part of “red” corresponds to an electrical pattern that combines with other electrical patterns to generate the complex electrical pattern corresponding to “red”. This is non-semantic composition—alphabet-like word construction. These elementary electrical configurations can recur in other internal words, as the same sound or letter can appear in different words; it is the combination that produces the specific meaningful word in question. Thus the LOT word for “red” is made up of a combination of electrical patterns that function like letters and phonemes. Of course, there may also be higher-level descriptions of these electrical constituents that we don’t now know about, but at a basic level the brain is producing electrical patterns that act as “hardware” for the abstract alphabet (“software”) associated with LOT. The important point is that there is non-semantic composition in LOT as well as semantic composition—just as in spoken and written language. LOT’s alphabet is electrical in nature, with charge and voltage corresponding to sound and shape (at least at the basic level).
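As a rough sketch of this proposal (the numerical codes below are invented placeholders for “electrical patterns”, not an empirical claim about the brain), the same meaningless elementary pattern can recur across different LOT words, and meaning is assigned only to the composite:

    # Sketch: meaningless elementary "patterns" (arbitrary codes here) recur across
    # LOT words; semantics attaches only to the composite pattern, never to an element.

    ELEMENT_CODE = {   # placeholder codes standing in for elementary electrical patterns
        "r": 0b0011, "e": 0b0101, "d": 0b0110, "o": 0b1001, "g": 0b1010,
    }

    def compose(letters):
        """Combine elementary codes into one composite pattern (a tuple of codes)."""
        return tuple(ELEMENT_CODE[x] for x in letters)

    WORD_MEANING = {            # meaning is assigned to composites only
        compose("red"): "RED",
        compose("dog"): "DOG",
    }

    print(compose("red"), "->", WORD_MEANING[compose("red")])
    print(compose("dog"), "->", WORD_MEANING[compose("dog")])
    # The element coded for "d" occurs in both words yet has no meaning of its own:
    print(ELEMENT_CODE["d"] in WORD_MEANING)    # False: nothing at the element level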
So it is not unreasonable to credit LOT with an alphabetic architecture, just like speech and writing. What this essentially means is that it obeys a principle of non-semantic composition. Does anything else obey such a principle? Why, intentional action does: it too proceeds by constructing complex units from units that are not themselves intentional actions. Whenever you perform an intentional action there are countless bodily events that you don’t also intend but which combine to produce what you do intend. Actions are not composed only of other actions, as words are not composed only of other words. In both cases there is a generative system that manufactures one sort of thing from another sort of thing—actions from non-intentional bodily events (e.g. nerve impulses innervating muscles); words from non-meaningful sounds or marks. We might even postulate a psychological law—the law of non-mental to mental composition. Just to be a bit snappier, let’s call it “the law of alphabetical composition”. This law says that certain things—speech, writing, thought, and intentional action—are constructed from elements drawn from outside these domains. To be more exact, there are two phases of construction: first, we construct an instance of these categories from elements that don’t fall within them; second, we construct further such instances using previously constructed instances. For example, we make spoken words from meaningless sounds and then we make further word-like items (phrases and sentences) from words. Words in LOT are likewise made from non-words by alphabetic procedures, and then these words are used to construct further meaningful units. Better put, the vehicles of meaning are the result of two generative processes: those that produce meaningful vehicles from non-meaningful vehicles, and those that produce meaningful vehicles from other meaningful vehicles. Perception might obey the same law, producing percepts from elements that are not themselves percepts; and these elements too might have alphabetic structure. It is often remarked what a marvelous invention the alphabet was, making written records finally available and usable; well, evolution seems to have been there before us, exploiting alphabetic structure in the management of speech, thought, action, and perception. Using a pre-semantic reality consisting of a small number of primitives to represent an infinite totality of semantic units is just too good an idea to pass up. The mind thus evolved as an alphabetic machine, and we cottoned on to this late in the game when we invented the first written alphabets.
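The two phases of construction can be kept cleanly apart in a toy grammar (again purely illustrative; the lexicon and the simple concatenation rule are stand-ins, not a theory of syntax): one generative process builds meaningful words out of meaningless elements, a second builds larger meaningful units out of those words.

    # Two generative processes:
    #   phase 1: meaningless elements -> meaningful words   (alphabetic composition)
    #   phase 2: meaningful words     -> meaningful wholes  (semantic composition)

    LEXICON = {"the": "THE", "dog": "DOG", "is": "IS", "red": "RED"}

    def phase_one(letters):
        """Build a word from meaningless letters; fail if no meaning attaches."""
        word = "".join(letters)
        if word not in LEXICON:
            raise ValueError(f"{word!r} is a string but not a word")
        return word

    def phase_two(words):
        """Build a larger meaningful unit from already-meaningful words."""
        return " ".join(words), [LEXICON[w] for w in words]

    dog = phase_one(["d", "o", "g"])
    red = phase_one(["r", "e", "d"])
    print(phase_two(["the", dog, "is", red]))
    # ('the dog is red', ['THE', 'DOG', 'IS', 'RED'])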
There is a tendency to think that thought is like language but with the words removed, leaving only the pure proposition. But this has to be wrong, because the mind and the brain need a medium of representation for thought to occur in; they can’t just grasp a meaning without any mediation. The language of thought is brought in to supply such a medium, but once we take that step we must ask about the nature of its composition; and once we do that, the idea of an alphabet-like structure starts to take shape. The internal language must be like the external language in this respect. Thus we end up with the idea that the words of LOT are composed like the words of speech and writing—combinations of a limited number of basic meaningless elements, such as sounds and shapes. Thought is alphabetically organized too.[2]
[1] We might have blindsight and deafhearing that prevent us from having any conscious perception of marks and sounds, and yet those things still have their impact on our nervous systems.
[2] A spoken or written language can be segmented according to two sorts of principle: semantic (which includes syntactic) and phonetic or figural. These are independent principles; both are essential to human language as we find it. Sentences sound or look a certain way as well as mean something or other. Similarly, we should think of the sentences of LOT as having these two dimensions: segmentation by meaning and segmentation by intrinsic character. The latter is where alphabetic structure comes in.