New Scientist 17.5.08


ARCHAEOLOGY can tell us plenty about how humans looked and the way they lived tens of thousands of years ago. But what about the deeper questions? Could early humans speak, were they capable of self-conscious reflection, did they believe in anything? Such questions might seem to be beyond the scope of science. Not so. Answering them is the focus of a burgeoning field that brings together archaeology and neuroscience. It aims to chart the development of human cognitive powers. This is not easy to do. A skull gives no indication of whether its owner was capable of speech, for example. The task then is to find proxies for key traits and behaviours that have stayed intact over millennia. Perhaps the most intriguing aspect of this endeavour is teasing out the role of culture as a force in the evolution of our mental skills (see "Biology alone may not have made us human"). For decades, development of the brain has been seen as exclusively biological. But increasingly, that is being challenged.

Take what the Cambridge archaeologist Colin Renfrew calls "the sapient paradox". Evidence suggests that the human genome, and hence the brain, has changed little in the past 60,000 years. Yet it wasn't until about 10,000 years ago that profound changes took place in human behaviour: people settled in villages and built shrines. Renfrew's paradox is this: if the hardware was in place, why did it take so long for humans to start changing the world? His answer is that the software - the culture - took a long time to develop. In particular, the intervening time saw humans vest meaning in objects and symbols. Those meanings were developed by social interaction over successive generations, passed on through teaching, and stored in the neuronal connections of children.

Culture also changes biology by modifying natural selection, sometimes in surprising ways. How is it, for example, that a human gene for making essential vitamin C became blocked by junk DNA? One answer is that our ancestors started eating fruit, so the pressure to make vitamin C "relaxed" and the gene became unnecessary. By this reasoning, early humans then became addicted to fruit, and any gene that helped them to find it was selected for.

Evidence suggests that the brain is so plastic that, like genes, it can be changed by relaxing selection pressure. Our understanding of human cognitive development is still fragmented and confused, however. We have lots of proposed causes and effects, and hypotheses to explain them. Yet the potential pay-off makes answers worth searching for. If we know where the human mind came from and what changed it, perhaps we can gauge where it is going (see "PERSPECTIVES: Who are we becoming?"). Finding those answers will take all the ingenuity the modern human mind can muster.



How culture made your modern mind
14 May 2008
Andy Coghlan

IT IS one of the hottest questions of our time: how did our cognitive abilities explode, leaving other animals for dust intellectually?

Now a new explanation is emerging. Controversially, it challenges the idea that biology alone is what drove the evolution of intellectual skills. What if we acquired abilities such as the capacity to invent, converse or work in unison as a result of a continual process of cultural cross-fertilisation with the world we inhabit, and through the way we interact with other people and material things?

Not only does this idea help explain how our species blossomed intellectually in the first place but it implies that our brains are continually changing whenever we meet new cultural concepts, objects and technologies, whether they are cellphones or new religions.

After Homo sapiens emerged about 200,000 years ago, it took around 140,000 years before any sign of modern civilisation emerged. So what happened that finally turned Stone Age boneheads into whizz-kids capable of creating stone tools, painting cave art and arranging burial rites for the dead? Various researchers explored this question at a historic meeting in September 2007 in Cambridge, UK, entitled "The Sapient Mind". Their discussions and conclusions were published last month in a collection of scientific papers in Philosophical Transactions of the Royal Society B.

A number of the researchers who contributed papers think that up to a certain point in history, biological factors alone controlled our brain's development. Then around 60,000 to 70,000 years ago, the biology and structure of our brains stopped changing and other factors began to take over as the main driver of human development.

For this to happen, however, the biological groundwork needed to be in place, they say. One of those biological foundations may have been the gradual expansion of working memory, which eventually enabled us to retain memories from the past, recognise objects in the present and plan ahead and conceive of a future (see "We're streets ahead").

The second was the emergence of a "theory of mind", which is the realisation that other creatures are intelligent and capable of independent thought and intention. It derives from the activity of "mirror" systems in the brain which enable an observer to feel the experiences of others, and to divine their intentions and motives.

Günther Knoblich of the University of Birmingham, UK, argues that a theory of mind and the capacity to separate the intentions of others from our own was a critical neurological breakthrough. It enabled humans to cooperate on tasks such as using the combined strength of several people to move a heavy object or hunt together. Even though these feats may have been accomplished without language and with the help of nothing more than gestures, signals and facial cues, they go way beyond simple mimicry of others, he says, which many animals can do.

Yet perhaps the biggest opportunity opened up by a theory of mind and an expanded working memory was the ability to learn, and to systematically educate other people. Animals learn by random observations of what other animals do. It is very seldom that they recognise the value of an innovation by their peers and then copy it themselves, such as shaking a tree to make fruit fall.

But thanks to theory of mind and the ability to divine the intentions of others, humans were able to train their offspring. During the process of teaching, both pupils and teachers are well aware of what's happening and know they must pay special attention beyond random observation. What's more, as working memory expanded, learning would have become more efficient.

This may have allowed us to steal a march on other species and our close relatives. "As far as I know, chimps don't teach each other," says Chris Frith of University College London. "So a chimp baby can learn by watching its mother, but does not explicitly get instructed. Nor is there teaching to take account of the lack of knowledge of the infant," he says.

Once teaching became possible for humans, differences between cultural groups would have begun to emerge, because different rules and traditions would have slowly been established in different communities, and passed down to consolidate social unity. This means that specific cognitive skills would have developed and would have dictated what elements of behaviour and achievement were seen as important by that culture.

As a modern example, take the case of Dauya, a magician-cum-astronomer from a tribe on Boyowa Island in Papua New Guinea. Interviewed in 1976 by Edwin Hutchins of the University of California, San Diego, who contributed one of the papers, Dauya had been taught to notify his fellow islanders when the Pleiades constellation becomes visible at dawn from a particular beach on the island. When it does, it is time for the islanders to plant their crops: after many years of observation, the tribe has learned that this coincides with the start of the growing season, even when the weather suggests the time may not be right. Dauya's task, and the fact that he has been taught it, are a considerable intellectual achievement, given that the tribe is pre-literate. Yet this skill is unique to the cultural context of communities on Boyowa Island, and has developed only because of it.

Our interactions with material objects and surroundings have also influenced the capabilities of the brain ever since a theory of mind and working memory became established, according to Chris Gosden of the University of Oxford.

Perhaps the best modern-day evidence that objects and surroundings influence brain structure and function comes from London taxi drivers, who develop an enlarged hippocampus to accommodate their abnormally large mental street map of London. "It's the condition of actually doing things that cumulatively shapes our thoughts and shapes our brains," says Gosden.

Dwight Read, an anthropologist at the University of California, Los Angeles, and colleagues argue a pivotal moment in human development came about 10,000 years ago, when our relationship with objects and material things changed significantly. Around this time, groups of hunter-gatherers began to replace their hunting spears with domesticated plants and animals, and to settle in a particular location. This meant they recognised the concept of "inside" and "outside" a living space and could create individually worked fields and enclosures for livestock and crops.

"Consequently, problem-solving rather than moving to a new location became the key to survival," he says. Once that had happened, the ability to teach systematically, to work together to solve problems, and the brain's capacity to adapt to cultural change truly came into its own, and any hold biology still had was released. After that, it seems, our ingenuity just exploded.



From issue 2656 of New Scientist magazine, 14 May 2008, pages 8-9

The right tools for language

Did the explosion in tool-making coincide with the emergence of language?

Dietrich Stout and colleagues at University College London took brain scans as three archaeologists skilled in making Stone Age tools practised their craft. They found that parts of the brain that became active coincided with areas vital for language. "Putting together a complicated sentence and making a tool are similar challenges, and so the underlying process is similar," says Stout. "It's calling on overlapping parts of the brain."

Likewise, Scott Frey of the University of Oregon found, after studying patients with brain injuries and scanning healthy individuals, that areas of the left hemisphere are crucial for planning the use of familiar tools. These areas also appear to underlie our ability to perform symbolic gestures, which some suggest may pre-date the evolution of speech. "The specific areas showing increased activity in healthy individuals are those most likely to be damaged in patients with apraxia, a deficit in performance of manual skills," says Frey.

We're streets ahead in the memory game

Compare our genes with a chimp's, and there's not much difference. Yet it's a different story when you look at "working memory".

Located in the prefrontal and parietal cortices, the "thinking" parts of the human brain, working memory enables us to link the past and present, and allows us to conceive of a future. No other species has developed this capacity so completely as humans, and early on it may well have allowed us to steal a march on our most recent ancestors.

Dwight Read, an anthropologist at the University of California, Los Angeles, and colleagues reckon our working memory underwent a gradual expansion till it reached a critical point about 60,000 years ago, when cognitive abilities such as systematic learning took off.

He and co-author Sander van der Leeuw of Arizona State University in Tempe, Arizona, base their assertions on comparisons of working memory in chimpanzees and modern-day humans. They estimated working memory capacity in chimpanzees from measurements of their performance in tasks such as cracking nuts with stones. For modern humans, they estimated memory capacity from published accounts of how babies' capabilities expand until they reach puberty. On a nominal scale of 1 to 7, humans reach a memory capacity of about 7 by the age of 12, whereas chimps seldom get beyond 2 or 3 on the scale.

The upshot is that in terms of tool use and manufacture, chimps simply can't compete with the accomplishments of modern humans or, by implication, our ancestors of 60,000 years ago, who had comparable brains to ours biologically. They can't learn as efficiently, for instance. So whereas our ancestors could have easily learned to crack open nuts, a quarter of chimpanzees can't learn to do this, however many times they see the task performed.