Working through Christine Kenneally's "The First Word" made me rethink many of the ideas I have learnt and collected over the years.
Like all linguists of my generation I learnt a lot from Noam Chomsky. Having been brought up on Ferdinand de Saussure's Cours de Linguistique Générale and the idea that a linguist should study "langue" - the structure of language - and not "parole", it was clear to me that Chomsky's "competence-performance" distinction was a natural guiding beacon. What I could not accept was Chomsky's idea of the centrality of "syntax". For me language was all about meaning. I could not believe that meaning was the result of parsing sentences. Bedeutungslehre - the study of meaning, semantics - was central to understanding linguistic utterances.
Studying meaning made me begin to doubt another of Chomsky's assumptions - modularity. Neuroscientists could not find a language module; what they found was a complex web of groups of neurons spanning the whole brain. In the books of Gerald Edelman I found that "the computer metaphor" was wrong - the brain does not work like a digital computer, it evaluates. Emotional evaluation is central to our understanding of language. (cf. Varela, Ethical Know-How)
Linguistics: The branch of science that I had studied as a young man has changed many of its basic assumptions over the past thirty years:
From studying "structure" - "langue" - "competence" -
to describing "processes" - "parole" - "performance".
We need to study "speech" and how this medium of human communication evolved.
Kenneally (p. 122): A child's ability to learn many words is so completely different from anything observed in other species that many researchers propose that some neural mechanism must be specially dedicated to this acquisition of linguistic knowledge. Beyond the basic link between an unanalysed sound and a simple reference in the world, words are clusters of complex knowledge about sound, grammar, and meaning. Human words don't exist by themselves. They are points in a series of intersecting systems, and when you hear or produce words, all these systems come into play. Recent research has shown that when children acquire words, they are not just creating a multidimensional connection between different kinds of linguistic and nonlinguistic knowledge based on a platform of sound and meaning. The essential scaffolding for word learning is more complicated than that. As well as a connection between two domains, such as the aural and the visual, there is a very important connection between speaking words and gesturing meaning.
Maturana (p. 27): We human beings exist as observers in language as we operate in the domain of structural coupling to which we belong.
Language is a manner of living together in a flow of coordinations of coordinations of consensual behaviours that arises in the history of living in the collaboration of doing things together.
We human beings exist and operate as human beings as we operate in language: languaging is our manner of living as human beings.
Language occurs in the actual flow of coordinations of coordinations of behaviours, not in any particular gesture, sound or attitude taken outside of that flow. It is like the movement seen in a film that exists as such only as long as the film runs. We human beings language while operating in the domain of structural coupling in which we coexist as languaging beings with other languaging beings. (Maturana - Communication Theory - Media Theory)
From studying "language-production" - "generative syntax" - to investigating "language-comprehension" - "cognitive semantics".
Human beings learn to understand language, to listen to language before they attempt to speak. We need to study the processes of understanding.
Tomasello: Usage-based theories hold that the essence of language is its symbolic dimension, with grammar being derivative. The ability to communicate with conspecifics symbolically (conventionally, intersubjectively) is a species-specific biological adaptation.
...in contrast to generative grammar and other formal approaches, in usage-based approaches the grammatical dimension of language is a product of a set of historical and ontogenetic processes referred to collectively as grammaticalization.
The implications of this new view of language for theories of language acquisition are truly revolutionary. If there is no clean break between the more rule-based and the more idiosyncratic items and structures of a language, then all constructions may be acquired with the same basic set of acquisitional processes - namely, those falling under the general headings of intention-reading and pattern-finding.
...the adult endpoint of language acquisition comprises nothing other than a structured inventory of linguistic constructions, a much closer and more child-friendly target than previously believed. These two new advances in developmental psychology and usage-based linguistics thus encourage us to pursue the possibility that we might be able to describe and explain child language acquisition without recourse to any hypothesized universal grammar.
(Tomasello - Edelman - Maturana - cognitive semantics)
From assuming an innate "language module" - to studying how children learn to understand languages.
I learnt that the ideas of 20th-century linguists about specialised parts of the human brain that came into existence through a major genetic mutation are not supported by the findings of brain scientists.
Tomasello: In general, in my opinion, many theorists are much too quick to explain uniquely human cognitive skills in terms of specific genetic adaptations - typically without any genetic research, it should be added. It is a popular procedure mainly because it is so quick, easy, and unlikely to be immediately refuted by empirical evidence. But another important reason for many theorists' tendency to posit innate cognitive modules as a method of first resort is a lack of appreciation of the workings of human cultural-historical processes, that is, processes of sociogenesis, both in the sense of their direct generative powers and in the sense of their indirect effects in creating a new type of ontogenetic niche for human cognitive development. And, importantly, historical processes work on a completely different time scale than evolutionary processes (Donald, 1991).
From regarding the brain as a digital computer - a trivial machine - to investigating the brain's "plasticity", its "learning capacity". The brain is a non-trivial machine.
Edelman: The brain is not a computer:
Our quick review of neuroanatomy and neural dynamics indicates that the brain has special features of organization and functioning that do not seem consistent with the idea that it follows a set of precise instructions or performs computations. We know that the brain is interconnected in a fashion no man-made device yet equals.
First, the billions and billions of connections that make up a brain are not exact: If we ask whether the connections are identical in any two brains of the same size, as they would be in computers of the same make, the answer is no. At the finest scale, no two brains are identical, not even those of identical twins. Although the overall pattern of connections of a given brain area is describable in general terms, the microscopic variability of the brain at the finest ramifications of its neurons is enormous, and this variability makes each brain significantly unique. These observations present a fundamental challenge to models of the brain that are based on instruction or computation. As we shall see, the data provide strong grounds for so-called selectional theories of the brain - theories that actually depend upon variation to explain brain function.
(Varela - Heinz von Foerster - Gerald Edelman)
Working through Christine Kenneally's "The First Word" made me rethink many other ideas on language and language evolution. You can find the results in Kenneally-Updating II: New arguments.